Open Model Zoo demos are console applications that provide templates to help implement specific deep learning inference scenarios. These applications show how to preprocess and post-process data for model inference and how to organize processing pipelines. Some pipelines collect analysis data from several models inferred simultaneously, for example, detecting a person in a video stream along with the person's physical attributes, such as age, gender, and emotional state.
Source code of the demos can be obtained from the Open Model Zoo GitHub repository:
```sh
git clone --recurse-submodules https://github.com/openvinotoolkit/open_model_zoo.git
```
C++, C++ G-API, and Python* versions are located in the `cpp`, `cpp_gapi`, and `python` subdirectories, respectively.
The Open Model Zoo includes the following demos:
- 3D Human Pose Estimation Python* Demo - 3D human pose estimation demo.
- 3D Segmentation Python* Demo - Segmentation demo segments 3D images using 3D convolutional networks.
- Action Recognition Python* Demo - Demo application for Action Recognition algorithm, which classifies actions that are being performed on input video.
- Background Subtraction Python* Demo - Background subtraction using instance segmentation based models.
- Background Subtraction C++ G-API* Demo - Background subtraction G-API version.
- BERT Named Entity Recognition Python* Demo - NER Demo application that uses a CONLL2003-tuned BERT model for inference.
- BERT Question Answering Python* Demo
- BERT Question Answering Embedding Python* Demo - The demo demonstrates how to run BERT based models for question answering task.
- Classification Python* Demo - Shows an example of using neural networks for image classification.
- Classification Benchmark C++ Demo - Visualizes OpenVINO performance on inference of neural networks for image classification.
- Classification Benchmark C++ G-API Demo - Classification Benchmark C++ G-API version.
- Colorization Python* Demo - Colorization demo colorizes input frames.
- Crossroad Camera C++ Demo - Person Detection followed by the Person Attributes Recognition and Person Reidentification Retail, supports images/video and camera inputs.
- Face Recognition Python* Demo - The interactive face recognition demo.
- Formula Recognition Python* Demo - The demo demonstrates how to run Im2latex formula recognition models and recognize latex formulas.
- Gaze Estimation C++ Demo - Face detection followed by gaze estimation, head pose estimation and facial landmarks regression.
- Gaze Estimation C++ G-API* Demo - Face detection followed by gaze estimation, head pose estimation and facial landmarks regression. G-API version.
- Gesture Recognition Python* Demo - Demo application for Gesture Recognition algorithm (e.g. American Sign Language gestures), which classifies gesture actions that are being performed on input video.
- Gesture Recognition C++ G-API* Demo - Demo application for Gesture Recognition algorithm (e.g. American Sign Language gestures), which classifies gesture actions that are being performed on input video. G-API version.
- GPT-2 Text Prediction Python* Demo - GPT-2 text prediction demo.
- Handwritten Text Recognition Python* Demo - The demo demonstrates how to run Handwritten Text Recognition models for Japanese, Simplified Chinese and English.
- Human Pose Estimation C++ Demo - Human pose estimation demo.
- Human Pose Estimation Python* Demo - Human pose estimation demo.
- Image Inpainting Python* Demo - Demo application for GMCNN inpainting network.
- Image Processing C++ Demo - Demo application for enhancing the resolution of the input image.
- Image Retrieval Python* Demo - The demo demonstrates how to run Image Retrieval models using OpenVINO™.
- Image Segmentation C++ Demo - Inference of semantic segmentation networks (supports video and camera inputs).
- Image Segmentation Python* Demo - Inference of semantic segmentation networks (supports video and camera inputs).
- Image Translation Python* Demo - Demo application to synthesize a photo-realistic image based on exemplar image.
- Instance Segmentation Python* Demo - Inference of instance segmentation networks trained in Detectron or maskrcnn-benchmark.
- Interactive Face Detection C++ Demo - Face Detection coupled with Age/Gender, Head-Pose, Emotion, and Facial Landmarks detectors. Supports video and camera inputs.
- Interactive Face Detection G-API* Demo - G-API based Face Detection coupled with Age/Gender, Head-Pose, Emotion, and Facial Landmarks detectors. Supports video and camera inputs.
- Machine Translation Python* Demo - The demo demonstrates how to run non-autoregressive machine translation models.
- Mask R-CNN C++ Demo for TensorFlow* Object Detection API - Inference of instance segmentation networks created with TensorFlow* Object Detection API.
- Monodepth Python* Demo - The demo demonstrates how to run monocular depth estimation models.
- MRI Reconstruction C++ Demo - Compressed Sensing demo for medical images.
- MRI Reconstruction Python* Demo - Compressed Sensing demo for medical images.
- Multi-Camera Multi-Target Tracking Python* Demo - Demo application for tracking multiple targets (persons or vehicles) on multiple cameras.
- Multi-Channel Face Detection C++ Demo - The demo demonstrates an inference pipeline for multi-channel face detection scenario.
- Multi-Channel Human Pose Estimation C++ Demo - The demo demonstrates an inference pipeline for multi-channel human pose estimation scenario.
- Multi-Channel Object Detection Yolov3 C++ Demo - The demo demonstrates an inference pipeline for multi-channel common object detection scenario.
- Noise Suppression Python* Demo - The demo shows how to use the OpenVINO™ toolkit to reduce noise in speech audio.
- Noise Suppression C++ Demo - The demo shows how to use the OpenVINO™ toolkit to reduce noise in speech audio.
- Object Detection Python* Demo - Demo application for several object detection model types (such as SSD, YOLO, etc.).
- Object Detection C++ Demo - Demo application for Object Detection networks (different models architectures are supported), async API showcase, simple OpenCV interoperability (supports video and camera inputs).
- Pedestrian Tracker C++ Demo - Demo application for pedestrian tracking scenario.
- Place Recognition Python* Demo - This demo demonstrates how to run Place Recognition models using OpenVINO™.
- Security Barrier Camera C++ Demo - Vehicle Detection followed by the Vehicle Attributes and License-Plate Recognition, supports images/video and camera inputs.
- Speech Recognition DeepSpeech Python* Demo - Speech recognition demo: accepts an audio file with an English phrase on input and converts it into text. This demo does streaming audio data processing and can optionally provide current transcription of the processed part.
- Speech Recognition QuartzNet Python* Demo - Speech recognition demo for QuartzNet: takes a whole audio file with an English phrase on input and converts it into text.
- Speech Recognition Wav2Vec Python* Demo - Speech recognition demo for Wav2Vec: takes a whole audio file with an English phrase on input and converts it into text.
- Single Human Pose Estimation Python* Demo - 2D human pose estimation demo.
- Smart Classroom C++ Demo - Face recognition and action detection demo for classroom environment.
- Smart Classroom C++ G-API Demo - Face recognition and action detection demo for classroom environment. G-API version.
- Smartlab Python* Demo - Action recognition and object detection for smartlab.
- Social Distance C++ Demo - This demo showcases a retail social distance application that detects people and measures the distance between them.
- Sound Classification Python* Demo - Demo application for sound classification algorithm.
- Text Detection C++ Demo - Text Detection demo. It detects and recognizes multi-oriented scene text on an input image and puts a bounding box around the detected area.
- Text Spotting Python* Demo - The demo demonstrates how to run Text Spotting models.
- Text-to-speech Python* Demo - Shows an example of using Forward Tacotron and WaveRNN neural networks for text to speech task.
- Time Series Forecasting Python* Demo - The demo shows how to use the OpenVINO™ toolkit for time series forecasting.
- Whiteboard Inpainting Python* Demo - The demo shows how to use the OpenVINO™ toolkit to detect and hide a person on a video so that all text on a whiteboard is visible.
To run the demo applications, you can use videos from https://storage.openvinotoolkit.org/data/test_data/videos.
You can download the Intel pre-trained models or public pre-trained models using the OpenVINO Model Downloader.
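For example, assuming the Model Downloader is available as the `omz_downloader` command (it is installed with the `openvino-dev` Python package and also ships with the Open Model Zoo tools), a model can be fetched by name. The model name below is only an illustration:
```sh
# Assumes the Model Downloader is installed (e.g. via the openvino-dev package).
# The model name is only an example; use --print_all to list every available model.
omz_downloader --print_all
omz_downloader --name face-detection-retail-0004 --output_dir models
```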
To build the demos, you need to source the OpenVINO™ environment and get OpenCV. You can install the OpenVINO™ toolkit using the installation package for Intel® Distribution of OpenVINO™ toolkit or build the open-source version available in the OpenVINO GitHub repository using the build instructions.
For the Intel® Distribution of OpenVINO™ toolkit installed to the `<INSTALL_DIR>` directory on your machine, run the following commands to download prebuilt OpenCV and set environment variables before building the demos:
```sh
source <INSTALL_DIR>/setupvars.sh
```
NOTE: If you plan to use Python* demos only, you can install the OpenVINO Python* package:
```sh
pip install openvino
```
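As a quick sanity check that the package is usable (this assumes OpenVINO 2022.1 or later, where the Python API is exposed as `openvino.runtime`), you can list the available inference devices:
```sh
# Assumes OpenVINO 2022.1+ (Python API exposed as openvino.runtime)
python3 -c "from openvino.runtime import Core; print(Core().available_devices)"
```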
For the open-source version of OpenVINO, set the following variables:
- `OpenVINO_DIR` pointing to a folder containing `OpenVINOConfig.cmake`
- `OpenCV_DIR` pointing to OpenCV. The same OpenCV version should be used for both the OpenVINO and demos build.

Alternatively, these values can be provided via the command line while running `cmake`. See the CMake search procedure.
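For example, both variables can be passed on the `cmake` command line; the paths below are placeholders for your own OpenVINO and OpenCV build or install locations:
```sh
# Example only: replace the placeholder paths with your own OpenVINO and OpenCV locations
cmake -DCMAKE_BUILD_TYPE=Release \
      -DOpenVINO_DIR=<dir_with_OpenVINOConfig.cmake> \
      -DOpenCV_DIR=<dir_with_OpenCVConfig.cmake> \
      <open_model_zoo>/demos
```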
Also add the paths to the built OpenVINO™ Runtime libraries to the `LD_LIBRARY_PATH` (Linux) or `PATH` (Windows) variable before building the demos.
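For example, on Linux (the path is a placeholder; point it at the directory that actually contains the built OpenVINO™ Runtime libraries):
```sh
# Example only: prepend the directory with the built OpenVINO Runtime libraries
export LD_LIBRARY_PATH=<openvino_runtime_libs_dir>:$LD_LIBRARY_PATH
```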
The officially supported Linux* build environment is the following:
- Ubuntu* 18.04 LTS 64-bit or Ubuntu* 20.04 LTS 64-bit
- GCC* 7.5.0 (for Ubuntu* 18.04) or GCC* 9.3.0 (for Ubuntu* 20.04)
- CMake* version 3.10 or higher.
To build the demo applications for Linux, go to the directory with the `build_demos.sh` script and run it:
```sh
build_demos.sh
```
You can also build the demo applications manually:
- Navigate to a directory that you have write access to and create a demos build directory. This example uses a directory named `build`:
  ```sh
  mkdir build
  ```
- Go to the created directory:
  ```sh
  cd build
  ```
- Run CMake to generate the Make files for release or debug configuration:
  - For release configuration:
    ```sh
    cmake -DCMAKE_BUILD_TYPE=Release <open_model_zoo>/demos
    ```
  - For debug configuration:
    ```sh
    cmake -DCMAKE_BUILD_TYPE=Debug <open_model_zoo>/demos
    ```
- Run the `cmake --build` tool to build the demos:
  ```sh
  cmake --build .
  ```
For the release configuration, the demo application binaries are in `<path_to_build_directory>/intel64/Release/`; for the debug configuration, in `<path_to_build_directory>/intel64/Debug/`.
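As a quick check, you can list the built binaries and print the usage message of one of them; `segmentation_demo` is used here only as an example, and every demo prints its options with `-h`:
```sh
# Example only: inspect the release build output and show one demo's command-line options
ls <path_to_build_directory>/intel64/Release/
<path_to_build_directory>/intel64/Release/segmentation_demo -h
```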
The recommended Windows* build environment is the following:
- Microsoft Windows* 10
- Microsoft Visual Studio* 2019
- CMake* version 3.14 or higher
To build the demo applications for Windows, go to the directory with the `build_demos_msvc.bat` batch file and run it:
```bat
build_demos_msvc.bat
```
By default, the script automatically detects the highest Microsoft Visual Studio version installed on the machine and uses it to create and build a solution for the demo code. Optionally, you can also specify the preferred Microsoft Visual Studio version to be used by the script. The supported version is `VS2019`. For example, to build the demos using Microsoft Visual Studio 2019, use the following command:
```bat
build_demos_msvc.bat VS2019
```
By default, the demo application binaries are built into the `C:\Users\<username>\Documents\Intel\OpenVINO\omz_demos_build\intel64\Release` directory.
The default build folder can be changed with the `-b` option. For example, the following command will build the Open Model Zoo demos into the `c:\temp\omz-demos-build` folder:
```bat
build_demos_msvc.bat -b c:\temp\omz-demos-build
```
You can also build a generated solution yourself, for example, if you want to build binaries in the Debug configuration. Run the appropriate version of Microsoft Visual Studio and open the generated solution file, `C:\Users\<username>\Documents\Intel\OpenVINO\omz_demos_build\Demos.sln`.
You can also build the demo applications using the `cmake --build` tool:
- Navigate to a directory that you have write access to and create a demos build directory. This example uses a directory named `build`:
  ```bat
  md build
  ```
- Go to the created directory:
  ```bat
  cd build
  ```
- Run CMake to generate project files:
  ```bat
  cmake -A x64 <open_model_zoo>/demos
  ```
- Run the `cmake --build` tool to build the demos:
  - For release configuration:
    ```bat
    cmake --build . --config Release
    ```
  - For debug configuration:
    ```bat
    cmake --build . --config Debug
    ```
The dependencies for Python demos must be installed before running them. This can be done with the following command:
```sh
python -mpip install --user -r <omz_dir>/demos/requirements.txt
```
The Python* Model API is factored out as a separate package. Refer to the Python Model API documentation to learn how to install it. At the same time, the demos can find this package on their own, so installing the Model API is not required to run the demos.
### Build the Native Python* Extension Modules
Some of the Python demo applications require native Python extension modules to be built before they can be run.
This requires you to have Python development files (headers and import libraries) installed.
To build these modules, follow the instructions for building the demo applications above, but add `-DENABLE_PYTHON=ON` to either the `cmake` or the `build_demos*` command, depending on which you use. For example:
```sh
cmake -DCMAKE_BUILD_TYPE=Release -DENABLE_PYTHON=ON <open_model_zoo>/demos
```
Once the modules are built, add the demo build folder to the `PYTHONPATH` environment variable.
To build specific demos, follow the instructions for building the demo applications above, but add `--target <demo1> <demo2> ...` to the `cmake --build` command or `--target="<demo1> <demo2> ..."` to the `build_demos*` command.
Note that the `cmake --build` tool supports multiple targets starting with CMake version 3.15; with lower versions, you can specify only one target.
For Linux*:
```sh
cmake -DCMAKE_BUILD_TYPE=Release <open_model_zoo>/demos
cmake --build . --target classification_demo segmentation_demo
```
or
```sh
build_demos.sh --target="classification_demo segmentation_demo"
```
For Microsoft Windows* OS:
```bat
cmake -A x64 <open_model_zoo>/demos
cmake --build . --config Release --target classification_demo segmentation_demo
```
or
```bat
build_demos_msvc.bat --target="classification_demo segmentation_demo"
```
Before running compiled binary files, make sure your application can find the OpenVINO™ and OpenCV libraries.
If you use a proprietary distribution to build demos, run the `setupvars` script to set all necessary environment variables:
```sh
source <INSTALL_DIR>/setupvars.sh
```
If you use your own OpenVINO™ and OpenCV binaries to build the demos, make sure you have added them to the `LD_LIBRARY_PATH` environment variable.
(Optional) The OpenVINO environment variables are removed when you close the shell. As an option, you can permanently set the environment variables as follows:
- Open the `.bashrc` file in `<user_home_directory>`:
  ```sh
  vi <user_home_directory>/.bashrc
  ```
- Add this line to the end of the file:
  ```sh
  source <INSTALL_DIR>/setupvars.sh
  ```
- Save and close the file: press the Esc key, type `:wq`, and press the Enter key.
- To test your change, open a new terminal. You will see `[setupvars.sh] OpenVINO environment initialized`.
To run Python demo applications that require native Python extension modules, you must additionally set up the `PYTHONPATH` environment variable as follows, where `<bin_dir>` is the directory with the built demo applications:
```sh
export PYTHONPATH="<bin_dir>:$PYTHONPATH"
```
You are ready to run the demo applications. To learn about how to run a particular demo, read the demo documentation by clicking the demo name in the demo list above.
Before running compiled binary files, make sure your application can find the OpenVINO™ and OpenCV libraries.
Optionally, download the OpenCV community FFmpeg plugin using the downloader script in the OpenVINO package: `<INSTALL_DIR>\extras\opencv\ffmpeg-download.ps1`.
If you use the Intel® Distribution of OpenVINO™ toolkit to build the demos, run the `setupvars` script to set all necessary environment variables:
```bat
<INSTALL_DIR>\setupvars.bat
```
If you use your own OpenVINO™ and OpenCV binaries to build the demos, make sure you have added them to the `PATH` environment variable.
To run Python demo applications that require native Python extension modules, you must additionally set up the `PYTHONPATH` environment variable as follows, where `<bin_dir>` is the directory with the built demo applications:
```bat
set PYTHONPATH=<bin_dir>;%PYTHONPATH%
```
To debug or run the demos on Windows in Microsoft Visual Studio, make sure you have properly configured the Debugging environment settings for the Debug and Release configurations. Set correct paths to the OpenCV libraries, and to the debug and release versions of the OpenVINO™ libraries.
For example, for the Debug configuration, go to the project's Configuration Properties, open the Debugging category, and set the `PATH` variable in the Environment field to the following:
```bat
PATH=<INSTALL_DIR>\runtime\bin\intel64\Debug;<INSTALL_DIR>\extras\opencv\bin;%PATH%
```
where `<INSTALL_DIR>` is the directory in which the OpenVINO toolkit is installed.
You are ready to run the demo applications. To learn about how to run a particular demo, read the demo documentation by clicking the demo name in the demos list above.
- Intel OpenVINO Documentation
- Overview of OpenVINO™ Toolkit Intel's Pre-Trained Models
- Overview of OpenVINO™ Toolkit Public Pre-Trained Models
* Other names and brands may be claimed as the property of others.