🚀 Feature

Make sure that all attributes (not just the learning rate) stay in sync between OSS.param_groups and the param_groups of the wrapped optimizer.
Motivation
Some frameworks make it possible to alter any attribute in param_groups, not just the learning rate (momentum, betas for Adam, ...). OSS does not currently propagate such changes to the wrapped optimizer and fails silently, as illustrated in the sketch below.
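A minimal sketch of the failure mode, assuming fairscale's OSS wrapper and assuming the rank-local wrapped optimizer is exposed as OSS.optim; the single-process gloo process group is only there so OSS can be instantiated outside a real distributed run:

```python
import os
import torch
import torch.distributed as dist
from fairscale.optim import OSS

# Single-process process group so OSS can be constructed (assumption: the
# gloo backend is available on this machine).
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group(backend="gloo", rank=0, world_size=1)

model = torch.nn.Linear(4, 4)
optimizer = OSS(model.parameters(), optim=torch.optim.SGD, lr=0.1, momentum=0.9)

# Changing an attribute through the sharded wrapper's param_groups ...
optimizer.param_groups[0]["momentum"] = 0.5

# ... is not necessarily reflected in the wrapped optimizer: per this issue,
# only the learning rate is kept in sync, so the inner SGD may still use 0.9.
print("wrapper momentum:", optimizer.param_groups[0]["momentum"])
print("inner momentum:  ", optimizer.optim.param_groups[0]["momentum"])

dist.destroy_process_group()
```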
Pitch
This is part of the loosely specified PyTorch optimizer interface: a nice-to-have rather than a strict requirement, but one that users are likely to expect.
Alternatives
- At least emit a warning when OSS.param_groups and the wrapped optimizer's param_groups are out of sync (see the sketch after this list)
- Remove the .param_groups access entirely
- Require a new OSS optimizer to be created every time the user wants to change an attribute in param_groups
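A sketch of the first alternative, a hypothetical helper (not part of fairscale) that compares attributes between the wrapper and the wrapped optimizer and warns on mismatch; it assumes the rank-local optimizer is exposed as OSS.optim and that its groups line up one-to-one with the wrapper's groups:

```python
import warnings


def warn_if_out_of_sync(sharded_optimizer):
    """Hypothetical helper: warn when attributes of the OSS wrapper's
    param_groups differ from those of the wrapped (rank-local) optimizer."""
    wrapper_groups = sharded_optimizer.param_groups
    inner_groups = sharded_optimizer.optim.param_groups

    for i, (wrapper_group, inner_group) in enumerate(zip(wrapper_groups, inner_groups)):
        for key, wrapper_value in wrapper_group.items():
            if key == "params":
                continue  # parameter lists differ by design (sharding)
            inner_value = inner_group.get(key)
            if inner_value != wrapper_value:
                warnings.warn(
                    f"param_groups[{i}]['{key}'] is out of sync: "
                    f"wrapper={wrapper_value!r}, inner optimizer={inner_value!r}"
                )
```

Usage would be e.g. calling warn_if_out_of_sync(optimizer) after mutating optimizer.param_groups, or hooking such a check into step() so the divergence is at least visible instead of silent.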
Additional context