YOLOX ONNX inference

Code to run inference on images and videos using a YOLOX ONNX model.

1. Building the environment

1.1 pip install

pip install -U pip && pip install -r requirements.txt
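
The contents of requirements.txt are not shown in this README. Judging from the package pins in the Dockerfile below, a plausible minimal file (an assumption, not necessarily the repository's actual file) would be:

```
onnxruntime==1.13.1
opencv-python==4.6.0.66
pyyaml==6.0
```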

1.2 Docker

1.2.1 docker build

FROM ubuntu:20.04
USER root

# Preset the timezone non-interactively so apt packages (e.g. tzdata,
# pulled in by libgl1-mesa-dev) do not block the build with a prompt.
ENV DEBIAN_FRONTEND=noninteractive
RUN ln -sf /usr/share/zoneinfo/Asia/Tokyo /etc/localtime

LABEL version="1.0"
LABEL description="Build operating environment for yolox-onnx-infer"

RUN apt-get update && \
    apt-get -y install python3-pip git libgl1-mesa-dev libglib2.0-0 && \
    pip3 install -U pip && \
    pip3 install onnxruntime==1.13.1 opencv-python==4.6.0.66 pyyaml==6.0

FROM debian:stable-slim
USER root

ENV DEBIAN_FRONTEND=noninteractive

LABEL version="1.0"
LABEL description="Build operating environment for yolox-onnx-infer"

RUN apt-get update && \
    apt-get -y install python3-pip git libgl1-mesa-dev libglib2.0-0 && \
    pip3 install -U pip && \
    pip3 install onnxruntime==1.13.1 opencv-python==4.6.0.66 pyyaml==6.0
docker build -t tatsuya060504/yolox-onnx-infer:v1.0.0 .

1.2.2 Docker Hub

https://hub.docker.com/repository/docker/tatsuya060504/yolox-onnx-infer

docker pull tatsuya060504/yolox-onnx-infer:raspberrypi
# or
docker pull tatsuya060504/yolox-onnx-infer:wsl2

1.2.3 Docker run

docker run -it --name=yolox-onnx-infer -v $(pwd):/home tatsuya060504/yolox-onnx-infer:v1.0.0

2. Inference model download

You can download a YOLOX ONNX model by executing the corresponding shell script in the model folder.

cd model
sh download_yolox_<type>_onnx.sh

You can also get the model from the Google Drive link below.

3. Inference

3.1 Yaml file

project_config:
  mode: video
  input_path: sample.mp4
  output_dir: outputs

yolox_config:
  model_path: model/yolox_tiny.onnx
  class_score_thr: 0.3
  input_shape: 416,416
  with_p6: False
  device: cpu
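
A minimal sketch of reading this config with PyYAML (the key names follow the example above; the split of `input_shape` into a tuple is an assumption about how the script consumes it, since `416,416` parses as a plain string):

```python
import yaml

# Same structure as the example config above, inlined for illustration.
CONFIG_TEXT = """
project_config:
  mode: video
  input_path: sample.mp4
  output_dir: outputs

yolox_config:
  model_path: model/yolox_tiny.onnx
  class_score_thr: 0.3
  input_shape: 416,416
  with_p6: False
  device: cpu
"""

config = yaml.safe_load(CONFIG_TEXT)
proj = config["project_config"]
yolox = config["yolox_config"]

# "416,416" is loaded as a string; split it into an integer (h, w) pair.
input_shape = tuple(int(v) for v in yolox["input_shape"].split(","))

print(proj["mode"])        # video
print(input_shape)         # (416, 416)
print(yolox["with_p6"])    # False (YAML parses `False` as a boolean)
```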

3.2 Inference command

python onnx_inference.py
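
onnx_inference.py itself is not reproduced in this README. As a rough sketch of what happens inside, YOLOX ONNX models emit grid-relative predictions that must be decoded back to pixel coordinates; the function below models that decoding step for a 416×416 input (names and structure are illustrative, patterned on the standard YOLOX demo decoding, not copied from this repository):

```python
import numpy as np

def decode_outputs(outputs, img_size=(416, 416), strides=(8, 16, 32)):
    """Map raw YOLOX head outputs of shape (1, N, 85) to absolute pixels:
    box center xy = (pred_xy + grid_cell) * stride,
    box size  wh = exp(pred_wh) * stride."""
    grids, expanded_strides = [], []
    for stride in strides:
        hsize, wsize = img_size[0] // stride, img_size[1] // stride
        xv, yv = np.meshgrid(np.arange(wsize), np.arange(hsize))
        grid = np.stack((xv, yv), axis=2).reshape(1, -1, 2)
        grids.append(grid)
        expanded_strides.append(np.full((1, grid.shape[1], 1), stride))
    grids = np.concatenate(grids, axis=1)
    expanded_strides = np.concatenate(expanded_strides, axis=1)

    outputs = outputs.copy()
    outputs[..., :2] = (outputs[..., :2] + grids) * expanded_strides
    outputs[..., 2:4] = np.exp(outputs[..., 2:4]) * expanded_strides
    return outputs

# For a 416x416 input the three strides yield 52*52 + 26*26 + 13*13 = 3549 anchors.
raw = np.zeros((1, 3549, 85), dtype=np.float32)
decoded = decode_outputs(raw)
print(decoded.shape)  # (1, 3549, 85)
```

After this decoding, the usual pipeline applies the `class_score_thr` from the YAML config and non-maximum suppression before drawing boxes.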