
Comments (18)

abisee commented on June 27, 2024

I tend to see this kind of output in the earlier phases of training (i.e. when the model is still under-trained). Look at the loss curve on TensorBoard -- has the loss decreased much? It may be that the model needs further training.

fishermanff commented on June 27, 2024

Thanks @abisee, it seems you are right. I found the loss is still high; the model needs more training steps.

makcbe commented on June 27, 2024

@abisee: thank you, and this is a great piece of work.
@fishermanff, are you able to say what counts as a high loss? Following the instructions, I am running train and eval concurrently. Is that correct? Also, any suggestions on when to stop training? Is it OK to stop when the loss stops decreasing? Thank you.

abisee commented on June 27, 2024

@makcbe Yes, the eval mode is designed to be run concurrently with train mode. The idea is that you can see the loss on the validation set plotted alongside the loss on the training set in TensorBoard, helping you to spot overfitting etc.

About when to stop training: there's no easy answer for this. You might keep training until you find that the loss on the validation set is not reducing any more. You might find that after some time your validation set loss starts to rise while your training set loss reduces further (overfitting). In that case you want to stop training. If your loss function has gone flat you can try lowering your learning rate.
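
A minimal sketch of this stopping rule (illustrative only; train_one_epoch and validation_loss are hypothetical stand-ins, not functions from this repo):

```python
import random

def train_one_epoch(lr):
    """Hypothetical stand-in for one pass of train mode."""
    pass

def validation_loss():
    """Hypothetical stand-in for the loss from the concurrent eval job."""
    return random.uniform(3.0, 5.0)

best_val = float("inf")
patience, bad_epochs = 3, 0
lr = 0.15  # the repo's default Adagrad learning rate

for epoch in range(100):
    train_one_epoch(lr)
    val = validation_loss()
    if val < best_val - 1e-3:        # validation loss still improving
        best_val, bad_epochs = val, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:   # validation loss has gone flat or risen
            if lr <= 1e-3:
                break                # lowering lr no longer helps: stop
            lr, bad_epochs = lr * 0.5, 0  # try a lower learning rate first
```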

In any case you should run decode mode and look at some generated summaries. The visualization tool will make this much more informative.

fishermanff commented on June 27, 2024

@makcbe Hi, the author has already answered your questions. I have run training for over 16.77k steps (seen on TensorBoard), and the loss is about 4.0. The generated summaries have some correct output, but overall performance is still far from the ACL results, so I think further training steps are needed.

makcbe commented on June 27, 2024

Thank you both for the support; that's definitely useful.

lilyzl commented on June 27, 2024

Hi @fishermanff @abisee,

When I trained for 3k steps, I saw that the generated summaries began to repeat the first sentence of the whole text. Did that happen to you?

Thanks.

lilyzl commented on June 27, 2024

Hi @fishermanff @abisee,

When I trained to 40k steps, the results turned into INFO:tensorflow:GENERATED SUMMARY: [UNK] [UNK] [UNK] [UNK] [UNK] [UNK] [UNK] ...
I just want to make sure that further training will make it better.

fishermanff commented on June 27, 2024

@lilyzl What's the loss? Does it converge? Check TensorBoard.

lilyzl commented on June 27, 2024

@fishermanff Thanks for replying.
The [UNK] results were due to NaN in the loss. I fixed it based on solutions from previous issues.
Another question: is the generated summary length variable? I set the minimum length to 30, and then all results became 30 tokens. How should I deal with that?
Thanks a lot!
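
For reference, a common guard against this (a hedged sketch, not the repo's actual fix; run_training_step and restore_last_checkpoint are hypothetical stand-ins) is to watch for a NaN loss and fall back to the last good checkpoint:

```python
import math

def run_training_step():
    """Hypothetical stand-in for one sess.run() that returns the loss."""
    return 4.0

def restore_last_checkpoint():
    """Hypothetical stand-in for reloading the most recent checkpoint."""
    pass

for step in range(200000):
    loss = run_training_step()
    if math.isnan(loss) or math.isinf(loss):
        # Training diverged. Continuing from corrupted weights is what can
        # produce the all-[UNK] output, so restore a good checkpoint instead.
        restore_last_checkpoint()
```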

fishermanff commented on June 27, 2024

@lilyzl Maybe you can stop decoding when the decoder reaches the STOP token.
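
For example (a sketch; stop_id stands in for the vocabulary id of the STOP token):

```python
def truncate_at_stop(token_ids, stop_id):
    """Cut a decoded sequence at the first STOP token, if one appears."""
    if stop_id in token_ids:
        return token_ids[:token_ids.index(stop_id)]
    return token_ids

# e.g. with stop_id = 3: truncate_at_stop([12, 7, 3, 9], 3) -> [12, 7]
```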

abisee commented on June 27, 2024

Hi @lilyzl

  1. Yes, repetition is very common (it is one of the two big things we aim to fix, as noted in the ACL paper). That's what the coverage setting is for: to reduce repetition.
  2. Yes, the generated summary length is variable. It's generated using beam search. Essentially it keeps producing tokens until it produces the STOP token. I'm not sure why your decoded summaries are all length 30 if your minimum length is 30. Have a look at the code in beam_search.py.
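
Roughly, the stopping rule works like this (a simplified sketch of beam_search.py with illustrative names, not the repo's exact code): a hypothesis that emits STOP is only accepted as a finished result once the decoder has run at least min_dec_steps steps.

```python
from collections import namedtuple

# Illustrative hypothesis record; the real class lives in beam_search.py.
Hyp = namedtuple("Hyp", ["tokens", "avg_log_prob", "latest_token"])

def filter_hyps(sorted_hyps, results, steps, stop_id, min_dec_steps, beam_size):
    """Keep growing hypotheses; accept finished ones only past the min length."""
    kept = []
    for h in sorted_hyps:  # assumed sorted by avg_log_prob, best first
        if h.latest_token == stop_id:
            if steps >= min_dec_steps:
                results.append(h)  # finished and long enough: accept it
            # a hypothesis that emits STOP too early is dropped, which is
            # one way a large minimum length can pin outputs at that length
        else:
            kept.append(h)         # still growing: keep it in the beam
        if len(kept) == beam_size or len(results) == beam_size:
            break
    return kept, results
```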

fishermanff commented on June 27, 2024

Hi @abisee
I have trained the model for 80k steps, and then I pressed Ctrl+C to terminate the training process. Will the variables saved in logs/train/ be restored automatically when I rerun run_summarization.py in 'train' mode, or do I need to add code like tf.train.Saver.restore() myself to restore the pretrained variables?

abisee commented on June 27, 2024

Hi @fishermanff

Yes, running run_summarization.py in train mode should restore your last training checkpoint. I think it's handled by the supervisor.
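
For the record, with the TF1-style API this repo uses, tf.train.Supervisor restores the latest checkpoint from its logdir automatically, roughly like this (a sketch; the variable and train op are stand-ins for the real model):

```python
import tensorflow as tf  # TF1-style API, as used by the repo

train_dir = "logs/train"  # directory holding the saved checkpoints

w = tf.Variable(0.0, name="w")    # stand-in for the model's variables
train_op = tf.assign_add(w, 1.0)  # stand-in for one training step

# Supervisor looks for the latest checkpoint under logdir and, if one
# exists, restores it before training resumes -- no manual
# tf.train.Saver.restore() call is needed.
sv = tf.train.Supervisor(logdir=train_dir, save_model_secs=60)
with sv.managed_session() as sess:
    for _ in range(10):
        sess.run(train_op)
```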

fishermanff commented on June 27, 2024

Thanks @abisee, copy that.

adowu commented on June 27, 2024

Hello, I also get [UNK] in my SUMMARY result. Could you tell me how to solve this problem? I found nothing in previous issues. Thanks a lot.

JenuTandel commented on June 27, 2024

I also have the same kind of problem. If you have any solution, please suggest it.

GaneshDoosa commented on June 27, 2024

Did anyone solve this [UNK] problem?
