
swaps lr sched order #2356

Merged
merged 3 commits into master from sched on Jun 25, 2020

Conversation

williamFalcon
Contributor

Fixes #2330
Fixes #2078
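
For context, a minimal sketch of why the initialization order matters, assuming the NVIDIA apex `amp` API that Lightning used for `precision=16` at the time (illustrative only, not the PR's actual diff): `amp.initialize` may wrap or replace the optimizer, so a scheduler built beforehand can keep a reference to the stale, pre-AMP object.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import ReduceLROnPlateau

model = torch.nn.Linear(10, 2)
optimizer = SGD(model.parameters(), lr=0.1)

# Broken order: the scheduler binds to `optimizer` before AMP touches it.
#   scheduler = ReduceLROnPlateau(optimizer)
#   model, optimizer = amp.initialize(model, optimizer, opt_level="O1")
# The scheduler would then step the stale, pre-AMP optimizer.

# Fixed order (what "swaps lr sched order" amounts to): set up AMP first,
# then build the scheduler against the final optimizer object.
#   model, optimizer = amp.initialize(model, optimizer, opt_level="O1")
scheduler = ReduceLROnPlateau(optimizer)
```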

@mergify bot requested a review from a team on Jun 25, 2020 at 02:42
@williamFalcon added labels: bug (Something isn't working), ready (PRs ready to be merged) on Jun 25, 2020
@SkafteNicki
Member

#2330 seems solved by this.

To solve #2078, we need to implement the solution in this comment (to keep the state of the schedulers the same after we have reinitialized the optimizer).
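
A hedged sketch of that suggestion (illustrative code, not Lightning internals): snapshot each scheduler's `state_dict`, rebuild the scheduler around the reinitialized optimizer, then restore the snapshot so counters such as `num_bad_epochs` survive the swap.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import ReduceLROnPlateau

model = torch.nn.Linear(10, 2)
optimizer = SGD(model.parameters(), lr=0.1)
scheduler = ReduceLROnPlateau(optimizer, patience=3)

state = scheduler.state_dict()  # keep counters such as num_bad_epochs

# ... optimizer gets reinitialized here (e.g. by amp.initialize) ...
optimizer = SGD(model.parameters(), lr=0.1)

# Reattach a fresh scheduler to the new optimizer and restore its state.
scheduler = ReduceLROnPlateau(optimizer, patience=3)
scheduler.load_state_dict(state)
```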

@pep8speaks

pep8speaks commented Jun 25, 2020

Hello @williamFalcon! Thanks for updating this PR.

Line 123:29: W503 line break before binary operator

Comment last updated at 2020-06-25 11:48:50 UTC
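
For reference, W503 flags a line break placed before a binary operator. A hypothetical snippet illustrating the warning (not the actual line 123 of this PR):

```python
first_value, second_value = 1, 2

# W503: the line break comes before the binary operator `+`.
total = (first_value
         + second_value)

# Breaking after the operator avoids W503 (some style guides prefer the
# operator-first form and disable W503 instead).
total = (first_value +
         second_value)
```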

@williamFalcon
Contributor Author

how's this now @SkafteNicki

SkafteNicki (Member) left a comment


LGTM

@mergify bot requested a review from a team on Jun 25, 2020 at 12:30
@williamFalcon merged commit c275e1f into master on Jun 25, 2020
@Borda deleted the sched branch on Jun 25, 2020 at 13:40
Labels
bug (Something isn't working), ready (PRs ready to be merged)
Development

Successfully merging this pull request may close these issues.

- use_amp and multiple optimizers bug
- Trainer(precision=16) fails with optim.lr_scheduler.ReduceLROnPlateau
3 participants