When a model has an input tensor that it never actually uses, such as `lengths`, the ONNX export (or PyTorch's tracer?) often strips it from the graph. For example, a classifier built on an LSTM needs `lengths`, so it stays in the graph inputs. With something like a ConvNet classifier, the length is never used, so it gets stripped out. That means if you send a `lengths` tensor to the exported model, you get an error.

We normally decide what to send based on the model.assets file, so we should filter the inputs based on `ort.InferenceSession(...).get_inputs()` and it should work out? The ONNX service might need the same filtering; I'm not sure the ONNX service ever checks the model.assets file.