`warmup_lr` is computed incorrectly in `step_ReduceLROnPlateau` #19
Hi, how do you use `ReduceLROnPlateau` as the `after_scheduler`? When I use it as the `after_scheduler`, the learning rate does not change during the warm-up in the first epochs.
Hi, I replaced this line
You can refer to it again; I updated it just now. I think the main cause is the
Thanks! I will test it.
Hi, I tested this method, but it has the same problem: the learning rate in the first epochs is still 5e-4. However, when I change the `after_scheduler` to a cosine scheduler, it runs correctly.
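For reference, the linear warm-up this scheduler is meant to apply scales each base learning rate from `base_lr` up to `multiplier * base_lr` over `total_epoch` epochs, so an lr stuck at the base value 5e-4 (as reported above) means the ramp is not being applied. A torch-free sketch of that formula (the function name and the example numbers are mine, chosen to match the 5e-4 report, not taken from the repo):

```python
def warmup_lr(base_lr, multiplier, epoch, total_epoch):
    """Linear warm-up: returns base_lr at epoch 0 and
    multiplier * base_lr once epoch reaches total_epoch."""
    return base_lr * ((multiplier - 1.0) * epoch / total_epoch + 1.0)

# With base_lr=5e-4, multiplier=10, total_epoch=5 the lr should grow
# every epoch instead of staying pinned at 5e-4:
lrs = [warmup_lr(5e-4, 10, e, 5) for e in range(6)]
```

If the optimizer's lr reads 5e-4 at every warm-up epoch, the values this ramp produces are being computed but never applied.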
Hi there! It still does not work with
I wonder whether you forgot to modify the line shown below in:
pytorch-gradual-warmup-lr/warmup_scheduler/scheduler.py
Line 44 in 6b5e895
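If I read the report right, the suspicion about that line is that during warm-up `step_ReduceLROnPlateau` computes the new learning rates but they never reach the optimizer's `param_groups`. A simplified, torch-free stand-in for what the warm-up step has to do (`FakeOptimizer` and `step_warmup` are hypothetical names for illustration, not the repo's code):

```python
class FakeOptimizer:
    """Stand-in with a torch-like param_groups: a list of dicts with an 'lr' key."""
    def __init__(self, lr):
        self.param_groups = [{'lr': lr}]

def step_warmup(optimizer, base_lrs, multiplier, epoch, total_epoch):
    """Compute linearly warmed-up lrs and write them into the optimizer."""
    warmup_lrs = [base_lr * ((multiplier - 1.0) * epoch / total_epoch + 1.0)
                  for base_lr in base_lrs]
    # The crucial part: actually update param_groups, not just return the values.
    for param_group, lr in zip(optimizer.param_groups, warmup_lrs):
        param_group['lr'] = lr
    return warmup_lrs

opt = FakeOptimizer(5e-4)
step_warmup(opt, [5e-4], multiplier=10, epoch=1, total_epoch=5)
```

Without the write-back loop, `opt.param_groups[0]['lr']` would stay at 5e-4, which matches the symptom in this thread.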
Here are the details:

- When I use `ReduceLROnPlateau` as the `after_scheduler` of `GradualWarmupScheduler`, the warm-up failed. The way I get the learning rate is `optim.param_groups[0]['lr']`. Then, when I use `get_lr` to get the learning rate, I found it is correct.
- When I use `StepLR` as the `after_scheduler`, I found there was no exception and no error.

Therefore, I think the learning rate of the optimizer hadn't been warmed up correctly.
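The mismatch described above, where `get_lr` looks correct while `optim.param_groups[0]['lr']` stays stale, can be reproduced without torch: if a scheduler only computes the warm-up value and never writes it back, the two disagree. A minimal sketch, with all names hypothetical:

```python
# Simulate a scheduler that computes warm-up lrs but never updates the optimizer.
param_groups = [{'lr': 5e-4}]   # what the optimizer actually uses
base_lrs = [5e-4]

def get_lr(epoch, multiplier=10, total_epoch=5):
    # What a get_lr-style method reports: the *intended* warm-up value.
    return [lr * ((multiplier - 1.0) * epoch / total_epoch + 1.0)
            for lr in base_lrs]

intended = get_lr(epoch=1)[0]    # warm-up looks correct here
actual = param_groups[0]['lr']   # but the optimizer was never updated
```

This is consistent with the report: `get_lr` returns the ramped value, while the optimizer keeps training at the base learning rate.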