🚀 The feature
Add ONNXRuntime Python interface.
Motivation, pitch
My current plan is to put the core code for ORT inference into the directory `yolort/runtime`, and to name the Python script `ort_modeling.py` (feel free to suggest a better name). We plan to add more Python inferencing backends to this directory in the future. We would then add a CLI tool, e.g. `run_model.py`, to the `tools` directory for common use cases.

cc @itsnine
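To make the idea concrete, here is a minimal sketch of what such an `ort_modeling.py` backend could look like. The class name `PredictorORT` and the lazy-import design are assumptions for illustration, not the final yolort API:

```python
# Hypothetical sketch of yolort/runtime/ort_modeling.py.
# The class name, file layout, and output handling are illustrative
# assumptions; they are not the final yolort interface.

from typing import List, Optional


class PredictorORT:
    """Wraps an ONNX Runtime session for running an exported model.

    The onnxruntime import is deferred so this module can sit alongside
    other backends without making onnxruntime a hard dependency.
    """

    def __init__(self, checkpoint_path: str, providers: Optional[List[str]] = None):
        self.checkpoint_path = checkpoint_path
        # Default to CPU; callers can pass e.g. ["CUDAExecutionProvider"].
        self.providers = providers or ["CPUExecutionProvider"]
        self._session = None  # created lazily on first use

    def _ensure_session(self):
        if self._session is None:
            import onnxruntime as ort  # deferred, optional dependency
            self._session = ort.InferenceSession(
                self.checkpoint_path, providers=self.providers
            )
        return self._session

    def predict(self, image):
        """Run inference on a preprocessed image (NCHW float32 ndarray)."""
        session = self._ensure_session()
        input_name = session.get_inputs()[0].name
        return session.run(None, {input_name: image})
```

The proposed `tools/run_model.py` CLI could then be a thin `argparse` wrapper that constructs this class from a checkpoint path and an image path.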
Originally posted by @zhiqwang in #176 (comment)