TensorRT model not loading #6453

Answered by dyastremsky
Raulval82 asked this question in Q&A
Quoting the TensorRT documentation: "By default, TensorRT engines are compatible only with the version of TensorRT with which they are built."

Triton 23.09 added version-compatibility support. To use this feature with a 23.09 or later container, you need to build your engines as version-compatible and pass the version-compatible backend flag to Triton. Examples of how to do this are in the related server pull request.
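As a rough sketch of the two pieces involved (`model.onnx` and `model.plan` are placeholder names; the flag names below follow my reading of the TensorRT and Triton TensorRT backend docs, so double-check them against your release):

```shell
# 1. Build the engine as version-compatible (supported in TensorRT 8.6+).
trtexec --onnx=model.onnx \
        --saveEngine=model.plan \
        --versionCompatible

# 2. Start Triton with the TensorRT backend configured to accept
#    version-compatible engines.
tritonserver --model-repository=/models \
             --backend-config=tensorrt,version-compatible=true
```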

You would need to look at the rest of your logs (verbose logging could be helpful here) to confirm that the issue is a version mismatch. If so, the most surefire fix is to generate the TensorRT engine inside the NGC TensorRT container of the same version as Triton (e.…
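For example, assuming an ONNX model and the 23.09 release (paths and model names are placeholders), the engine can be built in the matching TensorRT container and then served by the matching Triton container, with verbose logging enabled to inspect any load errors:

```shell
# Build the engine inside the TensorRT container whose version matches
# the Triton release (here 23.09), so build and runtime versions agree.
docker run --rm --gpus all -v "$(pwd)":/workspace \
    nvcr.io/nvidia/tensorrt:23.09-py3 \
    trtexec --onnx=/workspace/model.onnx --saveEngine=/workspace/model.plan

# Serve it with the Triton container of the same release; --log-verbose=1
# surfaces detailed backend messages if the engine still fails to load.
docker run --rm --gpus all -v "$(pwd)/model_repo":/models \
    nvcr.io/nvidia/tritonserver:23.09-py3 \
    tritonserver --model-repository=/models --log-verbose=1
```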
