auto learning rate finder fails AFTER optimization due to misconfiguration #5487
Comments
Good catch! Mind sending a PR with a fix?
I've actually never contributed to open source before :) How does this work? Do I get assigned, and is there a deadline? I want to do it!
We have a basic guide here! There's no deadline! Any contribution is welcome 😄
Can the same issue be worked on by two people without them knowing about it, until one of them opens a PR?
Yes. For this reason, you might want to create the PR in draft mode while you are working on it, so others know it is in progress: https://github.blog/2019-02-14-introducing-draft-pull-requests/
Thanks for contributing @noamzilo! I have assigned this ticket to you, so other contributors know that this is already in progress. Please let us know if you need any help!
How do I link this post to the PR? Currently my test doesn't pass because a dependency is missing. Please allow this library, or advise how I should limit the test time. Thanks :)
🐛 Bug
Optimizing the learning rate using `auto_lr_find=True` fails AFTER a learning rate was found, with the error

`pytorch_lightning.utilities.exceptions.MisconfigurationException: When auto_lr_find is set to True, expects that model or model.hparams either has field lr or learning_rate that can overridden`

i.e. only AFTER having waited for several minutes.
This can be caught BEFORE the long run to save time.
Please reproduce using the BoringModel
I could just copy-paste everything here... the only missing line is `self.lr = 999` under `model::__init__()`; if it is not present the error happens, otherwise it doesn't. Reproducing this a single time takes me 15 minutes (due to the lr-finding process), but I still wanted to report it because it seems important and I don't have time right now for a full reproduction on the template.
`trainer = Trainer(auto_lr_find=True)`
`trainer.tune(model)`
Then, for a while, I see the progress bar
Finding best initial lr: 27%|██▋ | 27/100 [02:43<07:37, 6.27s/it]
and then the error appears only after an lr has already been found.
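For reference, here is a minimal sketch of the repro I have in mind, based on the BoringModel template. The `RandomDataset`, the `getattr` fallback in `configure_optimizers`, and the placeholder value 999 are my assumptions, not the original training setup:

```python
import torch
from torch.utils.data import DataLoader, Dataset
from pytorch_lightning import LightningModule, Trainer


class RandomDataset(Dataset):
    """Tiny random dataset so the lr finder runs quickly."""

    def __init__(self, size=32, length=64):
        self.data = torch.randn(length, size)

    def __len__(self):
        return len(self.data)

    def __getitem__(self, index):
        return self.data[index]


class BoringModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)
        # Removing this attribute (and not providing `learning_rate` or the same
        # field on `self.hparams`) reproduces the MisconfigurationException,
        # but only AFTER the whole lr search has already run.
        self.lr = 999  # placeholder value; auto_lr_find overrides it

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        return self(batch).sum()

    def configure_optimizers(self):
        # getattr fallback so the optimizer still builds when `self.lr` is
        # deliberately removed to trigger the bug.
        return torch.optim.SGD(self.parameters(), lr=getattr(self, "lr", 0.1))


model = BoringModel()
trainer = Trainer(auto_lr_find=True)
trainer.tune(model, train_dataloader=DataLoader(RandomDataset(), batch_size=8))
```

With `self.lr` present, `tune()` finds a learning rate and writes it back onto the model; with it removed, the exception above is raised only after the full search has finished.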
Expected behavior
The `MisconfigurationException` quoted above (missing `lr` / `learning_rate` field on `model` or `model.hparams`) should be raised BEFORE the long calculation, ideally as soon as `trainer.tune(model)` is called.
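A cheap way to fail fast, sketched below as a user-side pre-check rather than an actual patch to the Lightning internals (the helper name `assert_lr_overridable` and the attribute list are my assumptions, taken straight from the error message):

```python
from pytorch_lightning.utilities.exceptions import MisconfigurationException


def assert_lr_overridable(model):
    """Raise immediately if auto_lr_find would have no lr/learning_rate field to override."""
    candidates = ("lr", "learning_rate")
    if any(hasattr(model, name) for name in candidates):
        return
    hparams = getattr(model, "hparams", None)
    if hparams is not None and any(
        hasattr(hparams, name) or (isinstance(hparams, dict) and name in hparams)
        for name in candidates
    ):
        return
    raise MisconfigurationException(
        "When auto_lr_find is set to True, expects that `model` or `model.hparams` "
        "either has field `lr` or `learning_rate` that can overridden"
    )


# Usage: call right before trainer.tune(model) so the failure is instant
# instead of surfacing only after minutes of lr search:
#   assert_lr_overridable(model)
#   trainer.tune(model)
```

The same kind of check could of course live inside the tuner itself, which is presumably what a fix PR would do.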
Environment
How you installed PyTorch (`conda`, `pip`, source): pip