🚀 Feature
It should be possible to temporarily suspend one optimizer.
Motivation
When training a GAN, if the generator or discriminator becomes too good, one might want to suspend that module's optimizer until the other module catches up.
Pitch
There should be a simple, clean way to disable and re-enable a particular optimizer, so that its training steps are skipped entirely until it is re-enabled.
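A minimal sketch of what such an API could look like in plain PyTorch; the `SuspendableOptimizer` wrapper and its `enable()`/`disable()` methods are hypothetical illustrations of the request, not part of torch or any existing library:

```python
import torch


class SuspendableOptimizer:
    """Wraps a torch.optim.Optimizer so its steps can be skipped."""

    def __init__(self, optimizer: torch.optim.Optimizer):
        self.optimizer = optimizer
        self.enabled = True

    def disable(self) -> None:
        self.enabled = False

    def enable(self) -> None:
        self.enabled = True

    def step(self, *args, **kwargs):
        # Skip the parameter update entirely while suspended.
        if self.enabled:
            return self.optimizer.step(*args, **kwargs)

    def zero_grad(self, *args, **kwargs):
        self.optimizer.zero_grad(*args, **kwargs)


# Illustrative usage in a GAN loop:
# opt_d = SuspendableOptimizer(torch.optim.Adam(discriminator.parameters()))
# if discriminator_too_strong:
#     opt_d.disable()   # discriminator stops updating
# ...
# opt_d.enable()        # resume once the generator catches up
```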
Alternatives
Tried returning a constant loss, but got an error because the gradient could not backpropagate through it.
Tried disabling that module's gradients (setting requires_grad = False on its parameters), but also got an error.
Considered zeroing the gradients manually, but that seems really fiddly and gross (see the sketch below).
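For reference, here is roughly what the manual zeroing alternative looks like in a plain PyTorch loop; the `opt_d`, `loss`, and `suspended` names are illustrative placeholders, not from the original report:

```python
import torch


def discriminator_step(opt_d: torch.optim.Optimizer,
                       loss: torch.Tensor,
                       suspended: bool) -> None:
    opt_d.zero_grad()
    loss.backward()
    if suspended:
        # Overwrite the freshly computed gradients with zeros so that
        # the subsequent step() leaves the parameters (mostly) unchanged.
        for group in opt_d.param_groups:
            for p in group["params"]:
                if p.grad is not None:
                    p.grad.zero_()
    opt_d.step()
```

Note that with stateful optimizers such as Adam, momentum buffers accumulated on earlier steps can still move the parameters even when the incoming gradients are all zero, which is one more reason a first-class suspend/resume API would be preferable to this workaround.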
Additional context
In general, allowing finer control over how multiple optimizers work would be beneficial in many settings.