Does converting all operations to onnx->tensorrt speed up a lot? #481
Unanswered
YoungjaeDev asked this question in Q&A
Replies: 0
Hello,
https://github.com/obss/sahi — this GitHub repository is for tiling (sliced) inference.
I'm developing it in two directions. Would converting all operations to ONNX and then TensorRT give a large speedup?
I think that if it were much faster, someone would already have tried this approach.
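For reference, a minimal sketch of the conversion path being asked about, assuming a PyTorch model. The stand-in model, input shape, and file names below are placeholders, not SAHI's actual pipeline; only the per-tile forward pass would move to TensorRT, while SAHI's slicing and result merging would still run in Python.

```python
import torch
import torchvision

# Stand-in model just to show the export path; in practice this would be the
# detection model used with SAHI (assumption, any torch.nn.Module works).
model = torchvision.models.resnet18(weights=None)
model.eval()

# Export one tile-sized input to ONNX. 640x640 is a typical SAHI slice size (assumption).
dummy = torch.zeros(1, 3, 640, 640)
torch.onnx.export(
    model,
    dummy,
    "model.onnx",
    opset_version=13,
    input_names=["images"],
    output_names=["output"],
    dynamic_axes={"images": {0: "batch"}},  # allow batching several tiles per call
)

# The ONNX file can then be built into a TensorRT engine, e.g. with the trtexec
# CLI that ships with TensorRT:
#   trtexec --onnx=model.onnx --saveEngine=model.engine --fp16
```

Whether this gives a large end-to-end speedup depends on how much of the total time is spent inside the model forward pass versus in slicing and postprocessing (NMS/merging), which would remain outside the engine.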