[BUG] AttributeError: 'PatchEmbed' object has no attribute 'dynamic_img_pad' #1940
Replies: 8 comments
-
@byunsunyoung you'll have to specify what 'using it as usual' means, as this was tested and works fine for me. Are you serializing the full model instead of the state dict in saved checkpoints? You cannot do that reliably across versions.
-
I've been using the beit model for the past two weeks, and I've been getting this error since I restarted the server today.
-
For your information, create_model succeeded; the error occurs during the fine-tuning process in PyTorch.
-
Cell In[45], line 3
Cell In[44], line 37, in train_model(model, dataloaders, criterion, optimizer, num_epochs)
File /opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py:1501, in Module._call_impl(self, *args, **kwargs)
File /opt/conda/lib/python3.8/site-packages/timm/models/beit.py:427, in Beit.forward(self, x)
File /opt/conda/lib/python3.8/site-packages/timm/models/beit.py:404, in Beit.forward_features(self, x)
File /opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py:1501, in Module._call_impl(self, *args, **kwargs)
File /opt/conda/lib/python3.8/site-packages/timm/layers/patch_embed.py:83, in PatchEmbed.forward(self, x)
File /opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py:1614, in Module.__getattr__(self, name)
AttributeError: 'PatchEmbed' object has no attribute 'dynamic_img_pad'
-
That attribute was added in 0.9.6, but it's disabled by default and wouldn't interfere with training of beit (it's not currently used by beit). I can fine-tune that model w/o issues, see example below... Are you sure your environment is in a good state? You don't have conflicting timm versions? This error should only happen if the model has the attributes of a previous version of timm but is somehow running the code of the new version. That usually happens when people save their model checkpoints by doing torch.save(model) instead of torch.save(model.state_dict()) and then restore across changes in the underlying code...
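The mechanism behind that failure mode can be sketched in plain Python, without torch: pickling an object stores only its instance attributes, while the class code is looked up at load time. The `PatchEmbed` class below is a hypothetical stand-in for illustration, not timm's actual implementation.

```python
import pickle

# "Old version" of a layer class: no dynamic_img_pad attribute yet.
class PatchEmbed:
    def __init__(self):
        self.patch_size = 16

old_layer = PatchEmbed()
blob = pickle.dumps(old_layer)      # analogous to torch.save(model)
state = dict(old_layer.__dict__)    # analogous to model.state_dict()

# "New version" of the same class: forward() now reads an attribute
# that __init__ sets, much like timm 0.9.6 added dynamic_img_pad.
class PatchEmbed:
    def __init__(self):
        self.patch_size = 16
        self.dynamic_img_pad = False

    def forward(self):
        return self.dynamic_img_pad

# Unpickling restores the OLD instance __dict__ but binds the NEW class
# code, so the new attribute is missing -> AttributeError at call time.
restored = pickle.loads(blob)
try:
    restored.forward()
    failed = False
except AttributeError:
    failed = True

# Building a fresh instance from current code and loading only the saved
# state works: the new __init__ supplies the new attribute's default.
fresh = PatchEmbed()
fresh.__dict__.update(state)
ok = fresh.forward() is False
```

`failed` ends up `True` and `ok` ends up `True`: the full-object restore breaks, the state-only restore does not. This is why the state-dict route survives library upgrades.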
-
I've been using "torch.save(model)" to save the model, and there's been no problem until now. This is the first time there's been a problem.
-
@byunsunyoung I'd say that
-
I will move this to discussions in case anyone else runs into it; it's happened with other small layer attribute changes in the past, because the nuances between the two ways of saving aren't always obvious...
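For reference, a minimal sketch of the version-safe checkpoint pattern, assuming a plain PyTorch environment; the tiny `nn.Sequential` model here is a hypothetical stand-in for a timm BEiT:

```python
import io
import torch
import torch.nn as nn

# Stand-in model; the save/load pattern is what matters, not the architecture.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Portable: save only the tensors (state dict), not pickled module objects.
buf = io.BytesIO()
torch.save(model.state_dict(), buf)

# On restore (possibly under a newer timm/torch), rebuild the model from
# the CURRENT code, then load the saved weights into it.
restored = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
buf.seek(0)
restored.load_state_dict(torch.load(buf))

# The restored model reproduces the original outputs.
x = torch.randn(1, 4)
same = torch.allclose(model(x), restored(x))
```

Because the module objects are rebuilt from current code, any attributes a new library version adds in `__init__` are present, and only the weights come from the checkpoint.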
-
I am using "beitv2_base_patch16_224", "beitv2_large_patch16_224".
I used it as usual, but I get an error. AttributeError: 'PatchEmbed' object has no attribute 'dynamic_img_pad'
Please fix the bug.