trtexec fails to build /mobile_sam_mask_decoder.onnx #16
Comments
I had the same problem.

Unfortunately I don't have a solution for it yet.

Bump... same issue here; any help would be much appreciated, thanks!
Two possible workarounds (use either one):

1. Export the decoder with a PyTorch version older than 2.1.0, or
2. Patch mask_decoder.py as in the gist below before exporting.

You might also need to install some additional packages first. The resulting ONNX files obtained by either workaround have no OneHot node.

More details: using Netron to inspect the files, the ONNX file converted by the following command has a OneHot node:

python3 -m nanosam.tools.export_sam_mask_decoder_onnx --model-type=vit_t --checkpoint=assets/mobile_sam.pt --output=/mnt/e/data/mobile_sam_mask_decoder.onnx

However, the ONNX file provided by the Google Drive link in README.md does not have one.
Awesome! Could you tell me why it works?
This is the changed part of the above gist code (https://gist.github.com/binh234/2bb4fb5be3066460825786ba7d46c55c#file-mask_decoder-py-L126):

@binh234's gist changes the repeat_interleave call in mask_decoder.py. A PR in PyTorch ("[ONNX] Simplify repeat_interleave export for scalar-valued 'repeat'") was merged on May 6, 2023, and changed the ONNX export behavior starting with PyTorch 2.1.0. Please check here: https://github.com/pytorch/pytorch/blame/v2.1.0-rc1/torch/onnx/symbolic_helper.py
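The idea behind the gist can be sketched as follows (this is an assumption based on the comments above, not the exact gist code): for a scalar repeat count, torch.repeat_interleave is exported by PyTorch >= 2.1 through an ONNX subgraph containing a OneHot node, which TensorRT rejects when it feeds a shape tensor. An expand plus reshape computes the same result and exports without OneHot:

```python
import torch


def repeat_interleave_dim0(x: torch.Tensor, repeats: int) -> torch.Tensor:
    # Equivalent to torch.repeat_interleave(x, repeats, dim=0) for a
    # scalar `repeats`: duplicate each row `repeats` times along a new
    # axis, then flatten that axis back into dim 0. This avoids the
    # OneHot-producing export path.
    return (
        x.unsqueeze(1)
        .expand(-1, repeats, *x.shape[1:])
        .reshape(-1, *x.shape[1:])
    )
```

A drop-in replacement like this in the decoder (at the line the gist points to) should produce an ONNX file that trtexec can parse.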
Trying to run:
trtexec --onnx=data/mobile_sam_mask_decoder.onnx --saveEngine=data/mobile_sam_mask_decoder.engine --minShapes=point_coords:1x1x2,point_labels:1x1 --optShapes=point_coords:1x1x2,point_labels:1x1 --maxShapes=point_coords:1x10x2,point_labels:1x10
after successfully exporting mobile_sam_mask_decoder.onnx with:
python3 -m nanosam.tools.export_sam_mask_decoder_onnx --model-type=vit_t --checkpoint=assets/mobile_sam.pt --output=/mnt/e/data/mobile_sam_mask_decoder.onnx
resulting in this error:
onnx2trt_utils.cpp:374: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[12/18/2023-11:39:43] [E] Error[4]: [graph.cpp::symbolicExecute::539] Error Code 4: Internal Error (/OneHot: an IIOneHotLayer cannot be used to compute a shape tensor)
[12/18/2023-11:39:43] [E] [TRT] ModelImporter.cpp:771: While parsing node number 146 [Tile -> "/Tile_output_0"]:
[12/18/2023-11:39:43] [E] [TRT] ModelImporter.cpp:772: --- Begin node ---
[12/18/2023-11:39:43] [E] [TRT] ModelImporter.cpp:773: input: "/Unsqueeze_3_output_0"
input: "/Reshape_2_output_0"
output: "/Tile_output_0"
name: "/Tile"
op_type: "Tile"
[12/18/2023-11:39:43] [E] [TRT] ModelImporter.cpp:774: --- End node ---
[12/18/2023-11:39:43] [E] [TRT] ModelImporter.cpp:777: ERROR: ModelImporter.cpp:195 In function parseGraph:
[6] Invalid Node - /Tile
[graph.cpp::symbolicExecute::539] Error Code 4: Internal Error (/OneHot: an IIOneHotLayer cannot be used to compute a shape tensor)
[12/18/2023-11:39:43] [E] Failed to parse onnx file
[12/18/2023-11:39:43] [I] Finished parsing network model. Parse time: 0.32614
[12/18/2023-11:39:43] [E] Parsing model failed
[12/18/2023-11:39:43] [E] Failed to create engine from model or file.
[12/18/2023-11:39:43] [E] Engine set up failed