bdiaz29/ConvertTagger2TensorRT

To install TensorRT you need to first install CUDA, and then follow NVIDIA's guide to install cuDNN.

Then make sure you have a Python 3.10 environment set up and download TensorRT.
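Setting up the Python 3.10 environment might look like this; the environment name `trt-env` is just an example:

```shell
# Create a Python 3.10 virtual environment (requires Python 3.10 on PATH)
python3.10 -m venv trt-env

# Activate it:
#   Windows:      trt-env\Scripts\activate
#   Linux/macOS:  source trt-env/bin/activate
source trt-env/bin/activate

# Confirm the interpreter version -- the TensorRT wheel must match it
python --version
```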

Once you extract the TensorRT download you will see two folders, bin and python. Both are important: bin\trtexec.exe is what you will use to build TRT engines, and the python folder is what you will use to install TensorRT into your environment. Find the wheel that corresponds to Python 3.10 and copy its path (on Windows you can do this by holding Shift, right-clicking the file, and selecting "Copy as path").

Then use that path to install TensorRT into your environment with pip install [filepath to wheel]


TensorRT should now be installed.
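The install step above might look like this in a terminal. The wheel path below is an example only; it depends on where you extracted TensorRT and which version you downloaded:

```shell
# Install the wheel that matches Python 3.10 (look for "cp310" in the filename).
# The path here is an example -- substitute the path you copied earlier.
pip install "C:\tools\TensorRT-8.6.1.6\python\tensorrt-8.6.1-cp310-none-win_amd64.whl"

# Quick sanity check that the package imports and reports a version
python -c "import tensorrt; print(tensorrt.__version__)"
```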

Model conversion: for model conversion you need to keep note of where you installed TensorRT and point the toml file in this repo to its location.
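For example, if you unpacked TensorRT to C:\tools\TensorRT-8.6.1.6, the toml entry would point there. The key name shown here is illustrative only; use the key that already exists in the repo's toml file:

```toml
# Hypothetical example -- keep the key name the repo's toml file actually uses
tensorrt_path = "C:\\tools\\TensorRT-8.6.1.6"
```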


Use the Python script convert model to convert the WD tagger. The script first converts the model to ONNX, then converts that to a TensorRT engine.

First, locate the directory containing the tagger files.

Pass that folder's path to --model_path, assign a batch size and a name, and run the script. The terminal will print a message when conversion is finished; this can take a while.
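Putting it together, an invocation might look like the following. The script filename and the batch-size/name flag spellings are assumptions based on the description above; check the script's --help output for the exact names:

```shell
# Hypothetical invocation -- verify flag names with: python convert_model.py --help
python convert_model.py \
    --model_path "C:\models\wd-v1-4-tagger" \
    --batch_size 4 \
    --name wd_tagger_b4
```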

About

Convert the WD taggers into TensorRT engines.
