How to configure an optimizer that uses model parameters for further instantiation? #2501
Replies: 2 comments 2 replies
-
Hi @chekhovana, without singleton support, referencing the instantiated model from the optimizer config isn't currently possible. In the future, singleton support may be implemented. For future reference, I'll give some details on how this could be done if singleton support were available. If it were, you could pass in the model like this:
```yaml
model:
  _target_: timm.create_model
  model_name: resnet18
  pretrained: False
  num_classes: 1
  _singleton_: true  # requires https://github.com/facebookresearch/hydra/issues/1393
optimizer:
  _target_: torch.optim.AdamW
  lr: 3e-4
  params: ${getattr:${model},"parameters"}
```

In this example, a custom resolver would also need to be registered first: `OmegaConf.register_new_resolver("getattr", getattr)`.
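To see what the `${getattr:...}` interpolation would compute once the resolver is registered, here is a minimal stand-in using only plain Python (no Hydra or OmegaConf needed); `FakeModel` is a hypothetical placeholder for the timm model, not part of the original thread:

```python
# Sketch of what ${getattr:${model},"parameters"} would resolve to after
# OmegaConf.register_new_resolver("getattr", getattr): a plain getattr
# call on the instantiated model object.

class FakeModel:  # hypothetical stand-in for timm.create_model(...)
    def parameters(self):
        return [0.1, 0.2]  # stand-in for the real parameter tensors

model = FakeModel()

# The interpolation resolves to the *bound method* itself...
params_attr = getattr(model, "parameters")

# ...which can then be called to obtain the iterable of parameters.
print(params_attr())
```

Note that the resolver yields the bound method, not its return value, so what the optimizer config receives is the callable itself.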
-
What about utilizing partial instantiation support?
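For context, Hydra's partial instantiation (setting `_partial_: true` on the optimizer config) makes `instantiate()` return a `functools.partial` with the config's kwargs pre-bound, leaving `params` to be supplied once the model exists. A standard-library sketch of the mechanics, where `FakeOptimizer` is a hypothetical stand-in for `torch.optim.AdamW`:

```python
from functools import partial

class FakeOptimizer:  # hypothetical stand-in for torch.optim.AdamW
    def __init__(self, params, lr):
        self.params = list(params)
        self.lr = lr

# With `_partial_: true`, instantiate(cfg.optimizer) would return
# roughly this: the target callable with the config's kwargs bound.
optimizer_factory = partial(FakeOptimizer, lr=3e-4)

# After the model is instantiated, complete the call with its parameters.
fake_model_params = [0.1, 0.2]  # stand-in for model.parameters()
optimizer = optimizer_factory(fake_model_params)
print(optimizer.lr)
```

This sidesteps the interpolation problem entirely: the config never needs to reference the model object at all.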
-
I would like to instantiate both the model and the optimizer directly from the config, but I can't figure out how to pass the model's parameters to the optimizer constructor. Below is my attempt, which raises an error.
Is it possible to achieve this?