[ONNXRuntimeError] : 9 : NOT_IMPLEMENTED : Could not find an implementation for the node ArgMax_1094:ArgMax(11) #10068
Comments
Hi, can you please check what type you are using for ArgMax? It is likely not supported for opset 11. Search for ArgMax in either the CPU or CUDA provider. If CUDA, there is a PR in progress: #9700
The type used for ArgMax is int64, and it can only be int64. Could you give me some advice?
I solved it: just change ArgMax's input to int32, leave everything else the same, and it works!
To make it clearer: when converting the model, cast the input vector to int32, then run the export:

```python
import torch
from torch import Tensor

def export_text_encoder():
    # text_encoder and tokenizer come from the model being exported
    text_encoder.eval()
    text = "A Diagram"
    input_tensor: Tensor = tokenizer(text)
    input_tensor = input_tensor.to(torch.int32)  # <---- here!
    model_text = 'mobileclip-text-encoder.onnx'
    torch.onnx.export(text_encoder, input_tensor, model_text)
```
Hi, I run

```python
onnx_session1 = onnxruntime.InferenceSession("./pretrained/textmodel.onnx")
```

and it generates the error below:

```
File "D:\anaconda3\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 312, in _create_inference_session
    sess.initialize_session(providers, provider_options)
onnxruntime.capi.onnxruntime_pybind11_state.NotImplemented: [ONNXRuntimeError] : 9 : NOT_IMPLEMENTED : Could not find an implementation for the node ArgMax_1094:ArgMax(11)
```

ONNX - 1.7.0
onnxruntime - 1.7.0

Could you help me solve this problem? The ONNX model I transferred is here.