When using multiple optimizers, the toggle_optimizer(...) function sets the requires_grad property to True for all parameters belonging to the param_groups of the toggled optimizer. This is incorrect: if the user has explicitly and permanently disabled requires_grad for some of those parameters, the function re-enables it for them as well.
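A minimal sketch of the symptom. The module names and the standalone loop below are illustrative, not taken from the report; the loop only mirrors what toggle_optimizer does to the toggled optimizer's param_groups:

```python
import torch
from torch import nn

# Illustrative setup: the user permanently freezes the encoder.
encoder, decoder = nn.Linear(4, 4), nn.Linear(4, 4)
for p in encoder.parameters():
    p.requires_grad = False

opt = torch.optim.SGD(
    list(encoder.parameters()) + list(decoder.parameters()), lr=0.1
)

# What toggle_optimizer currently does for the toggled optimizer:
# every parameter in its param_groups is re-enabled unconditionally.
for group in opt.param_groups:
    for param in group["params"]:
        param.requires_grad = True

# The frozen encoder parameters are now trainable again.
print(all(p.requires_grad for p in encoder.parameters()))  # True
```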
Proposed fix:
The requires_grad value of every parameter should be recorded beforehand in an instance attribute of the LightningModule, e.g. self.params_dict. In toggle_optimizer(), instead of setting params.requires_grad = True, set params.requires_grad = self.params_dict[params]. This restores the correct value for the parameters of the optimizer being toggled.
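A rough sketch of that idea as standalone functions. The helper names are illustrative and not the actual Lightning API; the real change would live inside LightningModule.toggle_optimizer (linked below):

```python
import torch
from torch import nn


def cache_requires_grad(module: nn.Module) -> dict:
    """Record the requires_grad value the user chose for every parameter."""
    return {param: param.requires_grad for param in module.parameters()}


def toggle_optimizer_fixed(module: nn.Module, optimizer: torch.optim.Optimizer,
                           params_dict: dict) -> None:
    """Enable grads only for the toggled optimizer, respecting frozen params."""
    # First disable gradients for all parameters, mirroring the linked implementation.
    for param in module.parameters():
        param.requires_grad = False

    # Restore the user's original choice instead of forcing requires_grad=True.
    for group in optimizer.param_groups:
        for param in group["params"]:
            param.requires_grad = params_dict[param]
```

Populating params_dict once (e.g. when training starts) and reusing it on every toggle keeps permanently frozen parameters frozen, while still enabling gradients for the rest of the toggled optimizer's parameters.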
Current implementation of toggle_optimizer:
https://github.com/PyTorchLightning/pytorch-lightning/blob/dabfeca92e0702e55f09ac53e9412672cd258cd3/pytorch_lightning/core/lightning.py#L1152-L1170
Setup:
pytorch-lightning 1.1.2
pytorch 1.7.1