Remove optimizer_connector.py
#10119
Comments
Dear @daniellepintz, this sounds good to me. @carmocca @awaelchli @Borda do you approve? Best,
probably fine :)
Not convinced. The optimizers are owned by the trainer; imo it should be the responsibility of the owner to update that state. The alternative would be to create a function that takes the optimizers as inputs and updates the learning rates on them. Adding this method to the training_epoch_loop seems a bit arbitrary in light of loop customization. If we are not sure what the design will be in the future, we could do it as an intermediate step and mark the update_learning_rates method as protected.
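For illustration, a minimal sketch of that alternative: a standalone helper that receives the scheduler configs and steps them, so neither a connector nor a particular loop has to own the logic. The helper name, its signature, and the config shape (a dict with "scheduler" and "interval" keys, loosely modelled on Lightning's lr_scheduler config) are assumptions for this example, not an existing Lightning API.

```python
# Hypothetical sketch only: a free function that any loop (built-in or custom)
# could call, instead of a connector or a specific loop owning the logic.
from typing import Any, Dict, List

import torch


def update_learning_rates(lr_scheduler_configs: List[Dict[str, Any]], interval: str) -> None:
    """Step every scheduler whose configured interval matches the caller's phase."""
    for config in lr_scheduler_configs:
        if config.get("interval", "epoch") == interval:
            config["scheduler"].step()


# Example usage from any loop, e.g. at the end of an epoch:
model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
configs = [{"scheduler": torch.optim.lr_scheduler.StepLR(optimizer, step_size=1), "interval": "epoch"}]
optimizer.step()  # normally done by the training step; keeps the scheduler-order warning away
update_learning_rates(configs, interval="epoch")
```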
It is, but not any more arbitrary than having it in a separate connector.
I agree with this. Marking this as "approved".
I don't think it's arbitrary. First of all, I am not a big fan of just marking
@awaelchli where do you envision this function living?
I said "arbitrary" because the update of learning rates could be called in any other loop, e.g. a custom loop. If the update can be called in any other loop, then the choice of putting the method in the training epoch loop is arbitrary, as it could live in any other place. Lightning calls the method in that loop, but that's a choice based on the "standard" loop we expect. Loop customization is supposed to enable us to change that. Whenever we make changes to the loops, I want to think about how this serves the reason we made the loops customizable in the first place. It's not in a great state yet, but it's much better than before and we can keep working towards it.
_update_learning_rates would just be one additional method on the TrainingEpochLoop, and once the deprecation ends, the connector can be removed. If we can't agree on what will happen with that functionality, imo it's better to just not expose it to the user. That has btw been the main argument of FB/@ananthsub for many other parts of the code base.
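As a rough sketch of that transitional state (simplified stand-ins, not the actual Lightning implementations; the forwarding path, warning text, and method bodies are assumptions): the loop owns the protected method, and the old connector entry point only forwards to it until the deprecation window ends.

```python
# Simplified stand-ins, not the real Lightning classes.
from warnings import warn


class TrainingEpochLoop:
    def __init__(self, trainer) -> None:
        self.trainer = trainer

    def _update_learning_rates(self, interval: str) -> None:
        """Step the schedulers whose configured interval matches the current phase."""
        for config in self.trainer.lr_schedulers:
            if config["interval"] == interval:
                config["scheduler"].step()


class OptimizerConnector:
    def __init__(self, trainer) -> None:
        self.trainer = trainer

    def update_learning_rates(self, interval: str) -> None:
        # Thin wrapper kept only for a deprecation window, then deleted together
        # with this connector. The attribute path below is illustrative.
        warn(
            "OptimizerConnector.update_learning_rates is deprecated; "
            "the logic now lives on TrainingEpochLoop._update_learning_rates.",
            DeprecationWarning,
        )
        self.trainer.fit_loop.epoch_loop._update_learning_rates(interval)
```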
I meant to title this as "remove", not "deprecate". Since this is an internal implementation detail, we shouldn't need a deprecation cycle?
I'm confused - are you saying you're ok with adding _update_learning_rates to TrainingEpochLoop?
Totally agree with making it protected, yes.
Proposed refactoring or deprecation
- Move the on_trainer_init function to trainer.py (a rough sketch follows this list):
  https://github.com/PyTorchLightning/pytorch-lightning/blob/c9bc10ce8473a2249ffa4e00972c0c3c1d2641c4/pytorch_lightning/trainer/connectors/optimizer_connector.py#L26
- Move the update_learning_rates function to training_epoch_loop.py:
  https://github.com/PyTorchLightning/pytorch-lightning/blob/c9bc10ce8473a2249ffa4e00972c0c3c1d2641c4/pytorch_lightning/trainer/connectors/optimizer_connector.py#L31
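For concreteness, a hedged sketch of the first move: the helper name _init_optimizer_state is invented for this example, and the attributes are an assumption about what the connector's on_trainer_init sets up today.

```python
# Illustrative only, not the actual Trainer code.
from typing import Any, Dict, List

import torch


class Trainer:
    def __init__(self) -> None:
        # previously done by OptimizerConnector.on_trainer_init
        self._init_optimizer_state()

    def _init_optimizer_state(self) -> None:
        self.optimizers: List[torch.optim.Optimizer] = []
        self.optimizer_frequencies: List[int] = []
        self.lr_schedulers: List[Dict[str, Any]] = []
```

The second move would make update_learning_rates a protected _update_learning_rates method on TrainingEpochLoop, as discussed in the comments.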
Motivation
We are auditing the connectors, and optimizer_connector.py can easily be removed since it has only one significant function (update_learning_rates), which is used in a single place (training_epoch_loop.py).