# NXP NNStreamer examples

The purpose of this repository is to provide, for demonstration purposes, functional examples of GStreamer/NNStreamer-based pipelines optimized and validated for designated NXP i.MX application processors.

## How to run examples

### Models and metadata download

The models and metadata files used by the examples are not archived in this repository. They must therefore be downloaded over the network before the examples are executed on the target. The download is done from the host PC by running the Jupyter Notebook download.ipynb. Refer to the download instructions for details.
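As an illustration, the notebook can be opened from the host PC as sketched below. The repository path and the notebook location are assumptions; refer to the download instructions for the actual location of download.ipynb.

```bash
# On the host PC, from the repository root
# (notebook path is an assumption; see the download instructions)
$ cd /path/to/nxp-nnstreamer-examples
$ jupyter notebook downloads/download.ipynb
```

Running all cells of the notebook fetches the models and metadata into the repository tree, ready to be uploaded to the target.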

### Execution on target

#### Python

Once the models have been fetched on the host PC, the repository contains both the examples and the downloaded artifacts, and can be uploaded to the target board to run individual examples. The complete repository can be uploaded from the host PC to the target with a regular scp command, or only the necessary directories can be uploaded with the upload.sh script provided for the host:

```bash
# Replace <target ip address> with the relevant value
$ cd /path/to/nxp-nnstreamer-examples
$ ./tools/upload.sh root@<target ip address>
```

#### C++ with cross-compilation

1. Fetch the models locally on the host PC.
2. Build a Yocto BSP SDK for the dedicated i.MX platform. Refer to the imx-manifest to set up the correct build environment. The SDK must be built with bitbake, using the imx-image-full image and the populate_sdk task, as follows:

```bash
# Build the SDK for the imx-image-full image recipe
$ bitbake imx-image-full -c populate_sdk
```

3. Run the SDK installer script located in /path/to/yocto/bld-xwayland/tmp/deploy/sdk/ to install the SDK; its environment can then be sourced.
4. Source the SDK environment and compile the C++ examples with CMake, then push the required artifacts to their expected folder on the board (the scp command can be used for this purpose).
Note: the path to the folder containing the data can be changed in the CMakeLists.txt file, as can the model and label names in the C++ example source files. Example:

```bash
# Source the SDK (installed by default in /opt/fsl-imx-xwayland/<LF version>/)
$ . /path/to/sdk/environment-setup-armv8-poky-linux
# Cross-compile the examples
$ cd /path/to/nxp-nnstreamer-examples
$ mkdir build && cd $_
$ cmake ..
$ make
# Send the classification example to the target, replacing <target ip address> with the relevant value
$ scp ./classification/example_classification_mobilenet_v1_tflite root@<target ip address>
```

The C++ examples were developed using a custom C++ library. A description of how to use the library can be found here.

### Compile models on target

Quantized TFLite models must be compiled with vela for the i.MX 93 Ethos-U NPU. This can be done directly on the target:

```bash
# Run directly on the target
$ cd /path/to/nxp-nnstreamer-examples
$ ./downloads/compile_models.sh
```

The examples can then be run directly on the target. More information on running individual examples is available in the relevant sections.
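For instance, once the repository and artifacts are on the board, an example can be launched from the repository root. The script name below is purely illustrative, not an actual file in this repository; refer to each category's documentation for the real entry points:

```bash
# On the target board, from the uploaded repository root
$ cd /path/to/nxp-nnstreamer-examples
# Script name is hypothetical; see the per-category instructions
$ ./detection/example_detection_mobilenet_ssd_tflite.sh
```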

## Categories

Note that the examples may not run on all platforms; check the table below for platform compatibility.

| Name | Platforms | Features |
| --- | --- | --- |
| Object detection | i.MX 8M Plus<br>i.MX 93<br>i.MX 95 | MobileNet SSD<br>Yolov4-tiny<br>TFLite<br>v4l2 camera<br>gst-launch<br>custom python tensor_filter |
| Image classification | i.MX 8M Plus<br>i.MX 93<br>i.MX 95 | MobileNet<br>TFLite<br>v4l2 camera<br>gst-launch |
| Image segmentation | i.MX 8M Plus<br>i.MX 93 | DeepLab<br>TFLite<br>jpeg files slideshow<br>gst-launch |
| Pose detection | i.MX 8M Plus<br>i.MX 93 | MoveNet<br>TFLite<br>video file decoding (i.MX 8M Plus only)<br>v4l2 camera<br>gst-launch<br>python |
| Face | i.MX 8M Plus<br>i.MX 93 | UltraFace<br>FaceNet512<br>Deepface-emotion<br>TFLite<br>v4l2 camera<br>python |
| Mixed | i.MX 8M Plus<br>i.MX 93 | MobileNet SSD<br>MobileNet<br>MoveNet<br>UltraFace<br>Deepface-emotion<br>TFLite<br>v4l2 camera<br>C++<br>custom C++ decoding |
| Depth | i.MX 8M Plus<br>i.MX 93 | MiDaS v2<br>TFLite<br>v4l2 camera<br>C++<br>custom C++ decoding |

The images and videos used have been released under the Creative Commons CC0 1.0 license or belong to the public domain.