
Temporarily suspend one optimizer #7198

Closed
turian opened this issue Apr 23, 2021 · 2 comments
Labels
feature (Is an improvement or enhancement), help wanted (Open to be worked on)

Comments

turian (Contributor) commented Apr 23, 2021

🚀 Feature

It should be possible to temporarily suspend one optimizer.

Motivation

When training a GAN, if the generator or discriminator becomes too good, one might want to suspend that module's optimizer until the other module catches up.

Pitch

There should be a simple, clean way to disable and re-enable a particular optimizer, so that its training steps are completely skipped until it is re-enabled.

Alternatives

  • Tried returning a dummy loss, but got an error because the loss had no gradient to backpropagate.
  • Tried disabling that module's gradients, but also got an error.
  • Considered zeroing the gradients manually, but that seems really fiddly and gross.

Additional context

In general, allowing finer control over how multiple optimizers work would be beneficial in many settings.

turian added the feature and help wanted labels Apr 23, 2021
ananthsub (Contributor) commented

@turian https://pytorch-lightning.readthedocs.io/en/latest/common/optimizers.html#manual-optimization should offer finer control over optimizer steps - does this work for your use case?
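In manual optimization the training loop owns each optimizer's step, so suspending one optimizer reduces to not calling its step. A dependency-free sketch of that control flow (the toy model and optimizer classes are illustrative stand-ins, not the PyTorch Lightning API):

```python
# Sketch of the manual-optimization pattern: the loop decides, per
# batch, whether the discriminator's optimizer steps at all. The
# classes below are hypothetical stand-ins for real modules/optimizers.

class ToyOptimizer:
    """Minimal SGD on a single scalar parameter, counting its steps."""
    def __init__(self, lr=0.1):
        self.param = 1.0
        self.lr = lr
        self.steps = 0

    def step(self, grad):
        self.param -= self.lr * grad
        self.steps += 1

def training_loop(n_batches, d_suspended):
    """Update the generator every batch; update the discriminator
    only on batches where d_suspended(batch) is False."""
    g_opt, d_opt = ToyOptimizer(), ToyOptimizer()
    for batch in range(n_batches):
        g_opt.step(grad=0.5)            # generator always trains
        if not d_suspended(batch):      # discriminator only when allowed
            d_opt.step(grad=0.5)
    return g_opt, d_opt

# Suspend the discriminator for the first half of training.
g_opt, d_opt = training_loop(10, d_suspended=lambda b: b < 5)
print(g_opt.steps, d_opt.steps)  # 10 5
```

The same gating condition could just as well read a flag on the module, so the suspension can be toggled from a callback or from validation metrics.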

turian (Contributor, Author) commented Apr 23, 2021

@ananthsub Thanks. It also appears that returning loss = None from training_step skips that optimizer's step: #5983
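A dependency-free sketch of that behavior, assuming the semantics described in #5983 (the automatic loop skips backward and step when training_step returns None); the loop and classes here are illustrative, not Lightning internals:

```python
# Sketch of automatic optimization that skips an update when
# training_step returns None. All names are hypothetical stand-ins.

class Counter:
    """Stand-in optimizer that only counts how often it stepped."""
    def __init__(self):
        self.steps = 0
    def step(self):
        self.steps += 1

def training_step(batch, optimizer_idx, d_frozen):
    # Returning None for the discriminator's optimizer suspends it.
    if optimizer_idx == 1 and d_frozen:
        return None
    return 0.5  # stand-in scalar loss

def automatic_loop(n_batches, d_frozen):
    optimizers = [Counter(), Counter()]  # generator, discriminator
    for batch in range(n_batches):
        for idx, opt in enumerate(optimizers):
            loss = training_step(batch, idx, d_frozen)
            if loss is None:   # skip backward/step entirely
                continue
            opt.step()
    return optimizers

g_opt, d_opt = automatic_loop(4, d_frozen=True)
print(g_opt.steps, d_opt.steps)  # 4 0
```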

@turian turian closed this as completed Apr 23, 2021