[skip ci] Allow ModelCheckpoint in Trainer callbacks list #3990
Conversation
Codecov Report

@@           Coverage Diff            @@
##           master   #3990    +/-   ##
========================================
- Coverage      89%     42%    -47%
========================================
  Files         212     212
  Lines       15562   15702    +140
========================================
- Hits        13867    6623   -7244
- Misses       1695    9079   +7384
The tests currently fail as they reveal another bug, which I will fix here: #4027
This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. If you need further help see our docs: https://pytorch-lightning.readthedocs.io/en/latest/CONTRIBUTING.html#pull-request or ask the assistance of a core contributor here or on Slack. Thank you for your contributions.
Allows passing a ModelCheckpoint directly to the Trainer's callbacks list.
This was not possible before because save_function was not set. We now set it on the callback directly.
The following cases need to be considered:
Trainer(checkpoint_callback=True, callbacks=[ModelCheckpoint()])
Trainer(checkpoint_callback=False, callbacks=[ModelCheckpoint()])
Trainer(checkpoint_callback=ModelCheckpoint(), callbacks=[ModelCheckpoint()])
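The precedence rule for these cases can be sketched as follows. This is a minimal, hypothetical illustration of the behavior described in this PR, not Lightning's actual implementation; the helper name `configure_checkpoint_callbacks` and the stand-in `ModelCheckpoint` class are invented for the example.

```python
class ModelCheckpoint:
    """Stand-in for pytorch_lightning.callbacks.ModelCheckpoint (illustration only)."""
    pass


def configure_checkpoint_callbacks(checkpoint_callback, callbacks):
    """Return the checkpoint callbacks the Trainer should use.

    checkpoint_callback: True, False, or a ModelCheckpoint instance.
    callbacks: the user-supplied callbacks list.
    """
    user_checkpoints = [c for c in callbacks if isinstance(c, ModelCheckpoint)]
    if user_checkpoints:
        # A ModelCheckpoint found in `callbacks` wins; the
        # `checkpoint_callback` argument is ignored in all three cases above.
        return user_checkpoints
    if isinstance(checkpoint_callback, ModelCheckpoint):
        return [checkpoint_callback]
    if checkpoint_callback:
        # checkpoint_callback=True -> create a default one
        return [ModelCheckpoint()]
    return []


# The three cases from the description all resolve to the callback
# passed in `callbacks`:
ck = ModelCheckpoint()
assert configure_checkpoint_callbacks(True, [ck]) == [ck]
assert configure_checkpoint_callbacks(False, [ck]) == [ck]
assert configure_checkpoint_callbacks(ModelCheckpoint(), [ck]) == [ck]
```

Under this sketch, the `checkpoint_callback` setting only matters when no ModelCheckpoint appears in `callbacks`, which matches the behavior proposed below.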
This PR ignores the checkpoint_callback setting if one is passed to callbacks. Does this make sense @PyTorchLightning/core-contributors?

TODO: for trainer.checkpoint_callback, we probably need to limit it to 1 checkpoint callback for now?