Finetune Transformers Models with PyTorch Lightning: documentation error? #139
When calculating the total steps, shouldn't we use the number of batches per epoch times the number of epochs? In this case, it would be `self.total_steps = (len(train_loader.dataset) // tb_size) * ab_size` instead of `self.total_steps = (len(train_loader.dataset) // tb_size) // ab_size`. Please correct me if anything here is wrong.

https://pytorchlightning.github.io/lightning-tutorials/notebooks/lightning_examples/text-transformers.html

cc @Borda @rohitgr7
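For context, here is a minimal sketch of the kind of `setup()` computation the report refers to, reconstructed around the two lines quoted above. The class name, the `num_devices` attribute, and the datamodule access are assumptions based on typical Lightning code, not verbatim tutorial code:

```python
import pytorch_lightning as pl


class GLUETransformer(pl.LightningModule):  # class name assumed
    def setup(self, stage=None):
        if stage != "fit":
            return
        # train_dataloader() is only called after setup(), so fetch it here.
        train_loader = self.trainer.datamodule.train_dataloader()

        # Effective batch size per forward pass across all devices.
        tb_size = self.hparams.train_batch_size * max(1, self.trainer.num_devices)
        # Gradient-accumulation factor folded together with the epoch count,
        # exactly as in the snippet under discussion.
        ab_size = self.trainer.accumulate_grad_batches * float(self.trainer.max_epochs)

        # The questioned line: dividing by ab_size also divides by max_epochs,
        # which shrinks total_steps as the epoch count grows.
        self.total_steps = (len(train_loader.dataset) // tb_size) // ab_size
```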
Comments

I guess it should be […]

Is […] okay?

Yes… I didn't see `max_epochs` there. It should be something like […]
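The snippet that followed this comment is not preserved in the page capture, but based on the surrounding discussion, a computation that factors `max_epochs` out of the divisor might look like this sketch (my reconstruction under the same assumed attribute names as above, not the exact code from the comment):

```python
def setup(self, stage=None):
    if stage != "fit":
        return
    train_loader = self.trainer.datamodule.train_dataloader()

    # Samples consumed per batch across all devices.
    tb_size = self.hparams.train_batch_size * max(1, self.trainer.num_devices)

    # Optimizer steps per epoch: batches per epoch, reduced by the
    # gradient-accumulation factor.
    steps_per_epoch = (len(train_loader.dataset) // tb_size) // self.trainer.accumulate_grad_batches

    # Multiply (rather than divide) by the number of epochs.
    self.total_steps = steps_per_epoch * self.trainer.max_epochs
```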
So do you also think there is an error in the documentation? I am not sure on my own.
@stancld, mind having a look, please? 🐿️
@Borda Yes, I will have a look on Friday (tomorrow). Can you assign the issue to me, please? :]
It looks like no such example is present in the tutorial anymore 🤔
https://github.com/Lightning-AI/tutorials/tree/main/lightning_examples/text-transformers |
Oh sorry, I have forked […]
@Borda I checked the notebook, and it looks like the calculation of the total number of training steps is now the responsibility of the Lightning Trainer. The reported error is therefore no longer relevant for this example, and I believe the issue can be closed/marked as done.
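For reference, a sketch of the pattern that defers the step count to the Trainer, using `Trainer.estimated_stepping_batches` (available since Lightning 1.6); the hyperparameter names here are illustrative, not necessarily the tutorial's exact code:

```python
from torch.optim import AdamW
from transformers import get_linear_schedule_with_warmup


def configure_optimizers(self):
    optimizer = AdamW(self.parameters(), lr=self.hparams.learning_rate)
    scheduler = get_linear_schedule_with_warmup(
        optimizer,
        num_warmup_steps=self.hparams.warmup_steps,
        # Lightning derives the total number of optimizer steps from
        # max_epochs, the dataloader length, the number of devices, and
        # accumulate_grad_batches, so no manual total_steps is needed.
        num_training_steps=self.trainer.estimated_stepping_batches,
    )
    scheduler = {"scheduler": scheduler, "interval": "step", "frequency": 1}
    return [optimizer], [scheduler]
```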