TensorRT Applications


This set of alwaysAI applications uses TensorRT binaries to run inference locally on an NVIDIA Jetson device. These binaries are available in the alwaysAI model catalog; each model name starts with TRT and ends with the Jetson device it should run on, for example nano. TensorRT binaries are the most efficient way to run inference on NVIDIA Jetson devices. alwaysAI currently supports TensorRT binaries for the Jetson Nano, Xavier NX, and AGX Xavier.


Repo Programs

license-vehicle-detection: detects vehicles and license plates; currently set up to work with the Jetson Nano.
distance-between-hands: detects the distance between hands in meters; requires an Intel RealSense camera and is set up to work with the Jetson Nano (see the depth sketch below).
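
For a sense of how the distance-between-hands app can report meters, here is a rough sketch (not the repo's code) of measuring the metric distance between two pixel locations using an Intel RealSense depth frame via the pyrealsense2 library; the hard-coded pixel coordinates stand in for detected hand positions:

import math
import pyrealsense2 as rs

# Start a depth stream on the RealSense camera (stream settings are illustrative).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    intrinsics = depth.profile.as_video_stream_profile().intrinsics

    def pixel_to_point(x, y):
        # Deproject a pixel plus its depth (in meters) into a 3D point in camera space.
        return rs.rs2_deproject_pixel_to_point(intrinsics, [x, y], depth.get_distance(x, y))

    # Placeholder pixel coordinates; a real app would use detected hand positions.
    left_hand = pixel_to_point(200, 240)
    right_hand = pixel_to_point(440, 240)
    print("Distance between hands: {:.2f} m".format(math.dist(left_hand, right_hand)))
finally:
    pipeline.stop()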

Requirements

An alwaysAI account and the alwaysAI Development Tools (CLI) installed on your development machine, plus a supported NVIDIA Jetson device (Nano, Xavier NX, or AGX Xavier) as the edge device. The distance-between-hands app additionally requires an Intel RealSense camera.

Usage

Once the alwaysAI tools are installed on your development machine (or on the edge device, if developing directly on it), you can run the following CLI commands:

To perform initial configuration of the app:

aai app configure

If you're running on a Jetson device other than a Nano, add the model from the catalog and update your app. For example, for an Xavier NX:

aai app models add alwaysai/TRT_ssd_mobilenet_v1_coco_vehicle_license_xavier

Then update app.py to reflect the new model ID.
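
As a rough illustration (the variable names are assumptions, not the repo's exact code), the model ID is typically the string passed to the detector in app.py, so switching devices means swapping that string while keeping the TensorRT engine:

import edgeiq

# Use the Xavier NX binary added above; on a Jetson Nano the ID would end in "_nano".
obj_detect = edgeiq.ObjectDetection(
    "alwaysai/TRT_ssd_mobilenet_v1_coco_vehicle_license_xavier")
obj_detect.load(engine=edgeiq.Engine.TENSOR_RT)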

To prepare the runtime environment and install app dependencies:

aai app install

To start the app:

aai app start
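
For reference, here is a minimal sketch of the kind of detection loop an app like license-vehicle-detection runs once started, using the edgeIQ Python API; the model ID, camera index, and confidence threshold are assumptions and the repo's actual app.py may differ:

import time
import edgeiq

def main():
    # Load the TensorRT binary for the target Jetson device.
    obj_detect = edgeiq.ObjectDetection(
        "alwaysai/TRT_ssd_mobilenet_v1_coco_vehicle_license_nano")
    obj_detect.load(engine=edgeiq.Engine.TENSOR_RT)

    with edgeiq.WebcamVideoStream(cam=0) as video_stream, edgeiq.Streamer() as streamer:
        time.sleep(2.0)  # let the camera warm up
        while True:
            frame = video_stream.read()
            results = obj_detect.detect_objects(frame, confidence_level=0.5)
            frame = edgeiq.markup_image(frame, results.predictions, colors=obj_detect.colors)
            streamer.send_data(frame, [p.label for p in results.predictions])
            if streamer.check_exit():
                break

if __name__ == "__main__":
    main()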

Support
