Not saving checkpoint when monitor is None and save_top_k is -1 #6096
Labels: bug (Something isn't working), checkpointing (Related to checkpointing), help wanted (Open to be worked on), priority: 0 (High priority task)
🐛 Bug
When `monitor` is None, `current` will be None here:
https://github.com/PyTorchLightning/pytorch-lightning/blob/6bc4490d01aed21c2d52f884d4afbeaa24a47ca0/pytorch_lightning/callbacks/model_checkpoint.py#L553
And because of that, `check_monitor_top_k` will return False:
https://github.com/PyTorchLightning/pytorch-lightning/blob/6bc4490d01aed21c2d52f884d4afbeaa24a47ca0/pytorch_lightning/callbacks/model_checkpoint.py#L340
`_update_best_and_save` also doesn't handle a None `current` and raises an error here: https://github.com/PyTorchLightning/pytorch-lightning/blob/6bc4490d01aed21c2d52f884d4afbeaa24a47ca0/pytorch_lightning/callbacks/model_checkpoint.py#L605
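To make the failure path concrete, here is a simplified, hypothetical reconstruction of the decision logic described above. The function names mirror `ModelCheckpoint.check_monitor_top_k`, but the bodies are sketches for illustration, not the library's actual code:

```python
# Hypothetical sketch of the buggy vs. expected ordering of checks in
# check_monitor_top_k. Not pytorch-lightning's actual implementation.

def check_monitor_top_k_buggy(save_top_k, current):
    # With monitor=None, `current` is never populated, so this early
    # return rejects the save even though save_top_k == -1 means
    # "keep every checkpoint".
    if current is None:
        return False
    if save_top_k == -1:
        return True
    raise NotImplementedError  # comparison against existing top-k omitted


def check_monitor_top_k_fixed(save_top_k, current):
    # Checking save_top_k first makes monitor=None + save_top_k=-1 work:
    # every checkpoint is kept, and `current` is only needed for ranking.
    if save_top_k == -1:
        return True
    if current is None:
        return False
    raise NotImplementedError  # comparison against existing top-k omitted
```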
Currently, checkpointing is tied to validation, which is not always what you want. I just want to save a checkpoint every k iterations.
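The "save every k iterations, independent of any monitored metric" use case can be illustrated with a minimal standalone sketch. The class name, the hook name, and the save callback are all illustrative, not part of pytorch-lightning's API:

```python
# Minimal illustration of metric-free periodic checkpointing: save every
# k training steps, with no monitored value involved. All names here are
# hypothetical, chosen only to mirror the use case described above.

class PeriodicSaver:
    def __init__(self, k):
        self.k = k        # save interval in training steps
        self.step = 0     # steps seen so far

    def on_step_end(self, save_fn):
        """Call after each training step; saves every k-th step."""
        self.step += 1
        if self.step % self.k == 0:
            save_fn(f"step={self.step}.ckpt")
```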
Please reproduce using the BoringModel
Sorry, no time to reproduce for now.
Expected behavior
A checkpoint should always be saved when `save_top_k == -1`, even if `monitor` is None.