DeepLabV3Plus-Pytorch-Model-deployment

This project is based on the DeepLabV3Plus-Pytorch repository: https://github.com/VainF/DeepLabV3Plus-Pytorch

onnx-export: exports an ONNX model from the DeepLabV3Plus-Pytorch project
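
Below is a minimal export sketch, assuming it is run from the DeepLabV3Plus-Pytorch repo root; the model variant, checkpoint path, input size, and opset are placeholders and may differ from the actual onnx-export code.

```python
# Minimal sketch: export a DeepLabV3Plus-Pytorch checkpoint to ONNX.
# The model factory name, checkpoint path/keys, and 513x513 input size are assumptions.
import torch
import network  # model factory package from DeepLabV3Plus-Pytorch

model = network.modeling.deeplabv3plus_mobilenet(num_classes=21, output_stride=16)
ckpt = torch.load("checkpoints/best_deeplabv3plus_mobilenet_voc_os16.pth",
                  map_location="cpu")
model.load_state_dict(ckpt["model_state"])
model.eval()

dummy = torch.randn(1, 3, 513, 513)  # NCHW dummy input used to trace the graph
torch.onnx.export(
    model, dummy, "deeplabv3plus.onnx",
    opset_version=11,
    input_names=["input"],
    output_names=["output"],
)
```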

onnx-runtime: loads the exported ONNX model and runs inference with ONNX Runtime
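
A minimal sketch of ONNX Runtime inference with the kind of pre-processing and post-processing this repo describes; the file names, 513x513 input size, and ImageNet normalization values are assumptions.

```python
# Minimal sketch: ONNX Runtime inference with pre- and post-processing in code.
import cv2
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("deeplabv3plus.onnx", providers=["CPUExecutionProvider"])

# Pre-processing: BGR->RGB, resize, scale to [0,1], ImageNet normalization, HWC->NCHW.
img = cv2.imread("demo.jpg")
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
img = cv2.resize(img, (513, 513)).astype(np.float32) / 255.0
img = (img - np.array([0.485, 0.456, 0.406])) / np.array([0.229, 0.224, 0.225])
blob = img.transpose(2, 0, 1)[None].astype(np.float32)

# Inference: output is a (1, num_classes, H, W) score map.
logits = sess.run(None, {sess.get_inputs()[0].name: blob})[0]

# Post-processing: per-pixel argmax gives the class-index segmentation mask.
mask = np.argmax(logits[0], axis=0).astype(np.uint8)
cv2.imwrite("mask.png", mask)
```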

ncnn_python_infer: runs inference with NCNN's Python SDK
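
A minimal sketch using NCNN's Python bindings, assuming the ONNX model has already been converted to deeplabv3plus.param / deeplabv3plus.bin and that the input/output blob names match the export above; the normalization values are likewise assumptions.

```python
# Minimal sketch: NCNN Python SDK inference for the converted DeepLabV3+ model.
import cv2
import numpy as np
import ncnn

net = ncnn.Net()
net.load_param("deeplabv3plus.param")
net.load_model("deeplabv3plus.bin")

img = cv2.imread("demo.jpg")  # BGR, HWC
# Convert BGR->RGB and resize to the network input size while building the ncnn.Mat.
mat_in = ncnn.Mat.from_pixels_resize(
    img, ncnn.Mat.PixelType.PIXEL_BGR2RGB,
    img.shape[1], img.shape[0], 513, 513)
# ImageNet mean/std expressed on the 0-255 scale (assumption).
mat_in.substract_mean_normalize(
    [123.675, 116.28, 103.53],
    [1 / 58.395, 1 / 57.12, 1 / 57.375])

ex = net.create_extractor()
ex.input("input", mat_in)            # blob name is an assumption
ret, mat_out = ex.extract("output")  # (num_classes, H, W) score map
mask = np.argmax(np.array(mat_out), axis=0).astype(np.uint8)
cv2.imwrite("ncnn_mask.png", mask)
```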

ncnn-cpp-infer: runs inference with NCNN's C++ SDK

rknn-toolkit2-convert: uses rknn-toolkit2 to convert the ONNX model to an RKNN model, run inference, and save the results
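
A minimal sketch of the rknn-toolkit2 conversion flow; the target platform, normalization values, calibration dataset, and file names are assumptions and may differ from the actual conversion script.

```python
# Minimal sketch: convert the ONNX model to RKNN with rknn-toolkit2.
from rknn.api import RKNN

rknn = RKNN()
# Normalization is folded into the RKNN model, so the NPU can take uint8 RGB input.
rknn.config(mean_values=[[123.675, 116.28, 103.53]],
            std_values=[[58.395, 57.12, 57.375]],
            target_platform="rk3588")
rknn.load_onnx(model="deeplabv3plus.onnx")
rknn.build(do_quantization=True, dataset="dataset.txt")  # dataset.txt lists calibration images
rknn.export_rknn("deeplabv3plus.rknn")

# Optional: run the simulator to sanity-check the converted model.
rknn.init_runtime()
# outputs = rknn.inference(inputs=[img])  # img: HWC uint8 RGB image
rknn.release()
```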

rknn_deeplabv3plus_demo: inference code that runs on the RK3588 NPU

About

A DeepLabV3Plus-Pytorch ONNX export tutorial, with pre-processing and post-processing implemented in the code.
