Thanks for catching this. This portion was recently modified and I must have missed testing the debug case. I have updated the code and the corresponding tests to account for this - #97. Hope this resolves it for you!
FWIW - since debug mode prevents writing checkpoints, etc. to disk, the trained model will load the parameters of the last epoch rather than the best epoch (which would have been saved to disk if debug=False).
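To illustrate the last-epoch vs. best-epoch distinction, here is a minimal sketch of that behavior. All names (run_epoch, save_checkpoint, the toy loss) are hypothetical placeholders, not AMPtorch's actual API; checkpoints are stored in a dict to stand in for disk writes.

```python
# Hedged sketch of the described behavior; names are hypothetical,
# not AMPtorch's actual code.

saved = {}  # stands in for checkpoints written to disk


def run_epoch(model, epoch):
    # Toy loss that improves then worsens, so "best" != "last" epoch.
    return abs(epoch - 2) + 1.0


def save_checkpoint(model, name):
    # Snapshot the model's current parameters.
    saved[name] = dict(model)


def train(model, epochs, debug=False):
    best_loss = float("inf")
    for epoch in range(epochs):
        model["epoch"] = epoch  # stand-in for updated weights
        loss = run_epoch(model, epoch)
        # Checkpoints are only written when debug is off.
        if not debug and loss < best_loss:
            best_loss = loss
            save_checkpoint(model, "best.pt")
    # debug=True: no checkpoint exists; model keeps last-epoch weights.
    # debug=False: the best-epoch parameters can be reloaded from "best.pt".
    return model


m = train({"epoch": -1}, epochs=5, debug=True)
print(m["epoch"], "best.pt" in saved)  # last epoch (4), nothing saved

saved.clear()
m = train({"epoch": -1}, epochs=5, debug=False)
print(saved["best.pt"]["epoch"])  # best epoch (2) was checkpointed
```

With debug=True the in-memory model is all that survives training, which is why it carries the final epoch's parameters rather than the best ones.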
Thanks again! This helps me a lot. I will download the new code and the issue can be closed.
Dear Developers and Users,
When I set debug to true in the config, training worked well but an error occurred during prediction.
After checking the source code, I found the following lines gave rise to this issue.
amptorch/amptorch/trainer.py
Line 128 in bd8af57
I don't understand why
self.config["dataset"]["descriptor"]
is only set when "if not self.debug". Shouldn't this be set regardless of whether debug is on?
Many thanks,
Jiayan
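For illustration, the reported failure follows this pattern (a hedged sketch with hypothetical names and a placeholder value; it does not reproduce amptorch/trainer.py verbatim):

```python
# Hedged sketch of the reported bug pattern; class and values are
# hypothetical, not the actual trainer.py code.

class Trainer:
    def __init__(self, config, debug=False):
        self.config = config
        self.debug = debug

    def setup(self):
        if not self.debug:
            # The descriptor is only recorded when debug is off...
            self.config["dataset"]["descriptor"] = "gmp"  # placeholder

    def predict(self):
        # ...so looking it up fails with a KeyError under debug=True.
        return self.config["dataset"]["descriptor"]


t = Trainer({"dataset": {}}, debug=True)
t.setup()
try:
    t.predict()
except KeyError as e:
    print("prediction failed in debug mode:", e)
```

The same setup with debug=False populates the key and prediction succeeds, which matches the behavior described in the issue.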