
Fix user warning produced by apex + scheduler combination #1873

Merged

Conversation

@SkafteNicki (Member) commented May 18, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

What does this PR do?

Fixes #841

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@mergify (bot) requested a review from a team May 18, 2020 13:07
@Borda added the 'bug' (Something isn't working) label May 18, 2020
@Borda added this to the 0.7.7 milestone May 18, 2020
@mergify (bot) requested a review from a team May 18, 2020 14:54
@justusschock (Member) left a comment


Just a small question. Besides that, it looks good to me :)

Review thread on pytorch_lightning/trainer/distrib_data_parallel.py (outdated, resolved)
@mergify (bot) requested a review from a team May 19, 2020 05:59
@Borda added the 'ready' (PRs ready to be merged) label May 21, 2020
@williamFalcon merged commit 8f6b7a2 into Lightning-AI:master May 22, 2020
@SkafteNicki deleted the bugfix/apex_scheduler_warning branch May 25, 2020 08:20
@Borda modified the milestones: 0.7.7 → 0.8.0 May 26, 2020
@s-rog (Contributor) commented Mar 11, 2021

/opt/conda/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:122: UserWarning: Seems like `optimizer.step()` has been overridden after learning rate scheduler initialization. Please, make sure to call `optimizer.step()` before `lr_scheduler.step()`. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate

Getting the same warning using native amp on the latest release (1.2.3). Did this PR remove the warning as well, or just make it so that it can be safely ignored?
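The mechanism behind the quoted warning can be sketched without apex, native amp, or Lightning: PyTorch's scheduler tags `optimizer.step` when the scheduler is constructed, and `scheduler.step()` emits this `UserWarning` if that tag has since disappeared, which is what happens whenever a mixed-precision backend replaces `optimizer.step` after the scheduler already exists. A minimal reproduction; the lambda override below is purely illustrative and simply mimics what an amp wrapper does:

```python
import warnings

import torch

model = torch.nn.Linear(2, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Scheduler built first: it tags the current optimizer.step.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1)

# Replacing optimizer.step afterwards (illustrative stand-in for an amp
# backend) drops the scheduler's tag, which is exactly what the
# UserWarning detects.
original = optimizer.step
optimizer.step = lambda *args, **kwargs: original(*args, **kwargs)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    optimizer.step()
    scheduler.step()

# The "optimizer.step() has been overridden" warning fires here.
override_warned = any("overridden" in str(w.message) for w in caught)
print(override_warned)
```

Swapping the two blocks (override first, scheduler second) silences the warning, which is the ordering this PR enforces for apex.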

Labels
bug (Something isn't working), ready (PRs ready to be merged)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Scheduler should be initialized after Apex is configured
5 participants