Fix overfit_batches when using with multiple val/test_dataloaders #2792
Conversation
Codecov Report
@@            Coverage Diff            @@
##           master   #2792    +/-   ##
========================================
- Coverage      91%     90%     -0%
========================================
  Files          76      76
  Lines        6787    6793      +6
========================================
- Hits         6150    6135     -15
- Misses        637     658     +21
It is ready to review.
This pull request is now in conflict... :(
Hello @ydcjeff! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2020-09-22 16:54:03 UTC
Ready to review. One thing this PR can't cover is if there is … Then, it will continue training with …
        output = self.trainer.accelerator_backend.test_step(args)
    else:
        output = self.trainer.accelerator_backend.validation_step(args)
except TypeError:
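For context, a minimal sketch of the fallback pattern this diff introduces. Only the accelerator calls and the TypeError handler come from the excerpt above; the surrounding function, the args list, and the retry-without-index behaviour are assumed here, not copied from the actual implementation.

def _evaluation_step(trainer, batch, batch_idx, dataloader_idx, test_mode):
    # Sketch only: try passing dataloader_idx; if the user's step does not
    # accept it, Python raises TypeError and we retry without the index.
    args = [batch, batch_idx, dataloader_idx]
    try:
        if test_mode:
            output = trainer.accelerator_backend.test_step(args)
        else:
            output = trainer.accelerator_backend.validation_step(args)
    except TypeError:
        # Assumed fallback: drop dataloader_idx and call the step again.
        args = [batch, batch_idx]
        if test_mode:
            output = trainer.accelerator_backend.test_step(args)
        else:
            output = trainer.accelerator_backend.validation_step(args)
    return output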
What if the TypeError is coming from some other bug? That would make it almost impossible to debug, because an unrelated error would surface later, far from this part of the code. Can we not just check whether the step takes multiple dataloaders?
related: #2266
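For reference, a rough sketch of the signature check suggested here, using only the standard library. The helper name and the call site are made up for illustration and are not Lightning's actual code.

import inspect

def step_accepts_dataloader_idx(step_fn):
    # Hypothetical helper: inspect the user's validation_step / test_step
    # up front instead of catching a TypeError after the call.
    params = inspect.signature(step_fn).parameters
    return "dataloader_idx" in params or any(
        p.kind is inspect.Parameter.VAR_POSITIONAL for p in params.values()
    )

# Usage sketch:
# args = [batch, batch_idx]
# if step_accepts_dataloader_idx(model.validation_step):
#     args.append(dataloader_idx)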
We only have access to dataloader_idx here; val_dataloader and test_dataloader become train_dataloader when overfit_batches is set.
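To make the constraint concrete: whether the extra index can be passed only shows up when the user's step is called, as in this plain-Python illustration (not Lightning code):

def step_single(batch, batch_idx):                 # step written for one dataloader
    return batch, batch_idx

def step_multi(batch, batch_idx, dataloader_idx):  # step written for several dataloaders
    return batch, batch_idx, dataloader_idx

step_multi("batch", 0, 1)       # fine: the step accepts the extra index
try:
    step_single("batch", 0, 1)  # extra positional argument
except TypeError as err:
    print(err)  # step_single() takes 2 positional arguments but 3 were given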
This issue can be fixed if we pass … What do you think?
This pull request is now in conflict... :(
What does this PR do?
Fixes #2532
Fixes #2325 (I think)
Colab to reproduce issue: https://colab.research.google.com/drive/1BtQBCoP5fK-aZm_2uLMOUbf2c9cu-yFb?usp=sharing
Colab for this PR: https://colab.research.google.com/drive/1nzVh8xeEGLOJvSZ0Ih7WccNHgI1gDzaD?usp=sharing
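For convenience, a minimal sketch of the setup those notebooks exercise. The model and data are made up; the relevant ingredients are overfit_batches combined with more than one val dataloader.

import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class ToyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def validation_step(self, batch, batch_idx, dataloader_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

    def train_dataloader(self):
        return DataLoader(TensorDataset(torch.randn(64, 8), torch.randn(64, 1)), batch_size=8)

    def val_dataloader(self):
        # two validation dataloaders + overfit_batches is what triggered the bug
        make = lambda: TensorDataset(torch.randn(32, 8), torch.randn(32, 1))
        return [DataLoader(make(), batch_size=8), DataLoader(make(), batch_size=8)]

trainer = pl.Trainer(overfit_batches=1, max_epochs=1)
trainer.fit(ToyModel())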
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃