
Commit

avoid accessing self.optimizer when lightning calls configure_optimizers
this property won't exist when the method is called by Lightning
sammlapp committed Sep 6, 2024
1 parent 249a8e8 commit 5b268c2
Showing 1 changed file with 6 additions and 1 deletion.
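For context, here is a minimal, hypothetical sketch of why the guard matters (this is not OpenSoundscape's actual CNN class; the module, layer sizes, and learning rate are invented for illustration). Lightning calls configure_optimizers() itself, before OpenSoundscape's non-Lightning workflow has ever created self.optimizer, so reading the attribute unconditionally would raise AttributeError; guarding with hasattr() falls back to building a fresh optimizer.

import torch
import lightning.pytorch as pl

class SketchModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.network = torch.nn.Linear(4, 2)
        # self.optimizer is deliberately NOT set here: the non-Lightning
        # workflow creates it later, so the attribute may not exist yet

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.network.parameters(), lr=1e-3)
        # the pattern this commit introduces: only reuse prior optimizer
        # state if the attribute exists and is not None
        if hasattr(self, "optimizer") and self.optimizer is not None:
            optimizer.load_state_dict(self.optimizer.state_dict())
        return optimizer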
7 changes: 6 additions & 1 deletion opensoundscape/ml/cnn.py
@@ -337,6 +337,10 @@ def configure_optimizers(
- can set to -1 to restart learning rate schedule
- can set to another value to start lr scheduler from an arbitrary position
Note: when used by Lightning, self.optimizer and self.scheduler should not be modified
directly; Lightning handles these internally. Lightning calls this method without
passing reset_optimizer or restart_scheduler, so the defaults (False) leave .optimizer and .scheduler unmodified.
Documentation:
https://lightning.ai/docs/pytorch/stable/api/lightning.pytorch.core.LightningModule.html#lightning.pytorch.core.LightningModule.configure_optimizers
Args:
@@ -360,7 +364,7 @@ def configure_optimizers(
self.network.parameters(), **self.optimizer_params["kwargs"].copy()
)

- if self.optimizer is not None:
+ if hasattr(self, "optimizer") and self.optimizer is not None:
# load the state dict of the previously existing optimizer,
# updating the params references to match current instance of self.network
try:
@@ -372,6 +376,7 @@
"attempt to load state dict of existing self.optimizer failed. "
"Optimizer will be initialized from scratch"
)
# TODO: write tests for lightning to check behavior of continued training

# create learning rate scheduler
# self.scheduler_params dictionary has "class" key and kwargs for init
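Relatedly, a small usage sketch of the continued-training behavior the TODO above refers to (hypothetical, not a test from the repository; it reuses the SketchModule defined in the sketch near the top of this page). Because the guard finds an existing optimizer, its state dict is copied into the freshly built one, so hyperparameters and momentum buffers carry over instead of being reset.

# simulate an earlier workflow that left an optimizer on the module
module = SketchModule()
module.optimizer = torch.optim.Adam(module.network.parameters(), lr=5e-4)

# configure_optimizers builds a new Adam (lr=1e-3) but then loads the old
# state dict, so the previous learning rate survives continued training
new_optimizer = module.configure_optimizers()
print(new_optimizer.state_dict()["param_groups"][0]["lr"])  # 0.0005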
