Based on the example given in geffnet (which works for efficientnet), I tried to export a timm model (tf_efficientnetv2_b2) to ONNX as follows:
```python
model = timm.create_model(model,
                          pretrained=False,
                          checkpoint_path=checkpoint_path,
                          exportable=True,
                          num_classes=8)
model.eval()
model.to(DEVICE)
# B x C x H x W
example_input = torch.randn(EXPORT_BATCH_SIZE, 3,
                            32 * config['train_width'],
                            32 * config['train_width']).to(DEVICE)
# Run model once before export trace
model(example_input)
# https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html
input_names = ["input0"]
output_names = ["output0"]
if dynamic_size:
    dynamic_axes = {'input0': {0: 'batch'}, 'output0': {0: 'batch'}}
    dynamic_axes['input0'][2] = 'height'
    dynamic_axes['input0'][3] = 'width'
else:
    dynamic_axes = {}
if aten_fallback:
    export_type = torch.onnx.OperatorExportTypes.ONNX_ATEN_FALLBACK
else:
    export_type = torch.onnx.OperatorExportTypes.ONNX
torch.onnx.export(model, example_input, onnx_name,
                  do_constant_folding=True, input_names=input_names,
                  output_names=output_names, dynamic_axes=dynamic_axes,
                  operator_export_type=export_type)
```
but I get the following error:
```
~/.pyenv/versions/3.8.8/envs/classifier/lib/python3.8/site-packages/torch/onnx/symbolic_helper.py in _onnx_opset_unsupported_detailed(op_name, current_opset, supported_opset, reason)
    231
    232 def _onnx_opset_unsupported_detailed(op_name, current_opset, supported_opset, reason):
--> 233     raise RuntimeError('Unsupported: ONNX export of {} in '
    234                        'opset {}. {}. Please try opset version {}.'.format(op_name, current_opset, reason, supported_opset))
    235

RuntimeError: Unsupported: ONNX export of Pad in opset 9. The sizes of the padding must be constant. Please try opset version 11.
```
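For context on where the non-constant padding comes from: the tf_-prefixed timm weights use TF-style "SAME" padding, where the pad amounts are computed from the runtime input size. Here is a minimal sketch of that computation (my own illustration, not timm's actual helper), which shows why the Pad sizes can't be baked in as constants when height/width are dynamic:

```python
import math

def same_pad(in_size, kernel, stride, dilation=1):
    # TF "SAME" padding: the total pad is a function of the input size,
    # so when height/width are dynamic the exported Pad op would need
    # runtime-computed sizes -- which opset 9 does not allow.
    out = math.ceil(in_size / stride)
    pad = max((out - 1) * stride + (kernel - 1) * dilation + 1 - in_size, 0)
    return pad // 2, pad - pad // 2

print(same_pad(32, 3, 2))  # (0, 1)
print(same_pad(33, 3, 2))  # (1, 1) -- different pads for a different input size
```

Opset 11's Pad accepts the pad sizes as a runtime input rather than a constant attribute, which is presumably why the error message suggests passing `opset_version=11` to `torch.onnx.export`.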
I understand that exporting timm models to ONNX isn't documented or supported, but I guess my questions are:

1. Is it likely that a soon-to-be-released efficientnetv2 will avoid this problem of non-constant padding, and therefore be more likely to be exportable?
2. If not, how difficult would it be to edit timm to support this export functionality, at least for this model?