ThreeDPoseUnityBarracuda

This project is not maintained. Since it is quite old, errors may occur; please treat it as such. The code for ThreeDPoseTracker has also been moved to a private repository, so the code published on GitHub is no longer maintained. We do not answer questions about the code.

Unity sample of 3D pose estimation using Barracuda

Outline

ThreeDPoseUnityBarracuda is a sample project that loads an onnx model with Barracuda and performs 3D pose estimation in Unity. Its accuracy is improved over the previous model.
*Be aware that the target should be a single person; it does not work with multiple targets.

This sample makes the avatar named "Unity chan" mimic the person in the video in real time by estimating 3D joint positions from the footage.

preview_daring.gif
preview_capture_v2.gif

Created with Unity 2019.3.13f1.
We use Barracuda 1.0.0 to load the onnx model.

Performance Report

GPU

GeForce RTX2070 SUPER ⇒ About 30 FPS
GeForce GTX1070 ⇒ About 20 FPS
※Without a GPU, it basically does not run well.

Install and Tutorial

Download and put files

  1. Put the folders named "Assets" and "Packages" into your Unity project.
    Project settings are now included in the repository, so just download/clone it to your local PC.

  2. Download the onnx model from our home page via the following URL:
    https://digital-standard.com/threedpose/models/Resnet34_3inputs_448x448_20200609.onnx

Settings in Unity Inspector

  1. Open the Unity project in the Unity Editor and put the onnx file in /Assets/Scripts/Model/. During this step the onnx file is automatically converted into Barracuda's NNModel type.

  2. Open "SampleScene" in the "Scene" folder.
    If a dialog shows up, choose "Don't Save".

  3. Set the model
    Drag the NNModel you placed in Assets/Scripts/Model/ and drop it onto "NN Model" of the GameObject named "BarracudaRunner" in the Inspector view. unity_inspector.PNG

  4. Start Debug
    Now you can see real-time motion capture by entering Play mode. unity_wiper_too_big.PNG

    However, it takes about 15 seconds to load the model while the video has already started playing.
    ※How long the model takes to load depends on your machine. unity_wiper_no_model.PNG

    You can avoid this problem by pausing the video until the model has finished loading.
    Set the playback speed of the Video Player to 0 to wait for the model to load.
    unity_debug_video_playback_speed.PNG

    Then set the value back to 1 to restart the video after the model has loaded (see the script sketch after this list).

  5. Arrange the size
    Sometimes the avatar goes out of the box, as in the screenshot above.
    In that case, adjust the "Video Background Scale" value of "MainTexture".
    The range is 0.1 ~ 1 and the default value is 1.
    Here, set it to 0.8.
    unity_arrange_size.PNG

  6. Start Debug again
    As you can see, the size of the avatar now fits the box. unity_wiper_size_suit.PNG
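
For reference, the sketch below shows the general idea behind steps 3 and 4: loading the NNModel with Barracuda and keeping the video paused (playback speed 0) until the worker has finished a warm-up run. This is a minimal, hypothetical script, not the project's actual BarracudaRunner; the field names and the warm-up approach are assumptions.

```csharp
using System.Collections;
using System.Collections.Generic;
using Unity.Barracuda;
using UnityEngine;
using UnityEngine.Video;

// Minimal sketch (not the project's BarracudaRunner): pause the video while the
// model loads, run one warm-up inference, then let the video play normally.
public class ModelLoadSketch : MonoBehaviour
{
    public NNModel nnModel;          // the imported onnx, assigned in the Inspector
    public VideoPlayer videoPlayer;  // the scene's Video Player
    private IWorker worker;

    IEnumerator Start()
    {
        videoPlayer.playbackSpeed = 0f;   // hold the video while the model loads

        var model = ModelLoader.Load(nnModel);
        worker = WorkerFactory.CreateWorker(WorkerFactory.Type.ComputePrecompiled, model);

        // Warm-up: the first Execute is what takes the ~15 seconds mentioned above.
        // This model takes three 448x448 image inputs, so feed one dummy tensor per input.
        var inputs = new Dictionary<string, Tensor>();
        foreach (var input in model.inputs)
            inputs[input.name] = new Tensor(1, 448, 448, 3);

        worker.Execute(inputs);
        yield return null;                // give the worker a frame to finish

        foreach (var t in inputs.Values)
            t.Dispose();

        videoPlayer.playbackSpeed = 1f;   // restart the video once the model is ready
    }

    void OnDestroy()
    {
        worker?.Dispose();
    }
}
```

This only illustrates the timing trick; in the actual project the "BarracudaRunner" component handles model loading, so the Inspector steps above are all that is required.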

※Other Option

・Choose Video
You can choose the target video.
Put the video of your choice in Assets/Video/, then drag the file and drop it onto "Video Clip" of the "Video Player".
unity_chooseVideo.PNG

・Choose Avatar
There are two types of avatars in this Scene.
You can change the avatar easily in the Inspector view.
First, activate the GameObject named "Tait" and deactivate "unitychan".
Second, drag the GameObject and drop it onto "V Nect Model" of "BarracudaRunner".
unity_set_anoter_avater_to_obj.PNG

*To determine the facing direction of the avatar, a GameObject that works as a nose has been added to each of these avatars.
So if you would like to use your own original avatar, please add the nose by referencing the code (see the sketch below).
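
As an illustration of why the nose object helps, the hypothetical helper below resolves a head rotation from neck, head and nose positions. The joint names and the math are assumptions about the idea, not the project's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch: with a nose position in addition to the head, the
// facing direction of the head can be resolved unambiguously.
public static class HeadDirectionSketch
{
    public static Quaternion HeadRotation(Vector3 neck, Vector3 head, Vector3 nose)
    {
        Vector3 up = (head - neck).normalized;        // spine-to-head axis
        Vector3 forward = (nose - head).normalized;   // head-to-nose gives the gaze direction
        // Make forward perpendicular to up before building the rotation.
        Vector3.OrthoNormalize(ref up, ref forward);
        return Quaternion.LookRotation(forward, up);
    }
}
```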

・Use Web Camera
By checking "Use Web Cam", you can switch the input images to a web camera feed.
unity_use_web_cam.PNG
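
Conceptually, switching to a web camera just means feeding a WebCamTexture instead of the video texture. The sketch below is a hypothetical stand-alone example; in this project the actual toggle is the "Use Web Cam" checkbox shown above.

```csharp
using UnityEngine;

// Hypothetical sketch: render a live web camera feed onto the input quad.
public class WebCamInputSketch : MonoBehaviour
{
    public Renderer targetRenderer;   // e.g. the quad that shows the input image
    private WebCamTexture webCamTexture;

    void Start()
    {
        webCamTexture = new WebCamTexture(448, 448);  // roughly match the model's input resolution
        targetRenderer.material.mainTexture = webCamTexture;
        webCamTexture.Play();
    }

    void OnDestroy()
    {
        if (webCamTexture != null) webCamTexture.Stop();
    }
}
```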

・Skip On Drop
If "Skip On Drop" in Video Player checked, VideoPlayer is allowed to skip frames to catch up with current time.

How to make a good estimate?

how_to_make_good_estimate.png

The frame displayed in the upper left corner (InputTexture) is the input image to the trained model. Make sure that the whole body fits inside this frame. The pose cannot be estimated correctly if the limbs stick out past the edges of this frame. Since the program assumes the whole body is always inside the frame, the error increases when it is not. Also, keep the background as simple as possible; pants work better than skirts.
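
For intuition: the network's input is a fixed 448x448 image (see the onnx file name), so the source frame is scaled down into that square and anything outside it never reaches the network. The helper below is only an illustrative sketch of that scaling, not the project's actual preprocessing code.

```csharp
using UnityEngine;

// Illustrative sketch: scale whatever frame is being used (video or web cam)
// into the square texture that the model consumes. Limbs outside the source
// frame are simply absent from this input.
public static class InputTextureSketch
{
    public static RenderTexture ToModelInput(Texture source, int size = 448)
    {
        var input = new RenderTexture(size, size, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(source, input);   // scales the whole frame into the square input
        return input;
    }
}
```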

Info

・Record
If you want to record the motion, the following package might be suitable.
https://github.com/zizai-inc/EasyMotionRecorder

License

Non-commercial use

・Please use it freely for hobbies and research.
When redistributing, it would be appreciated if you could include a credit (Digital-Standard Co., Ltd.).

・The videos "Action_with_wiper.mp4" (original video: https://www.youtube.com/watch?v=C9VtSRiEM7s) and "onegai_darling.mp4" (original video: https://www.youtube.com/watch?v=tmsK8985dyk) included in this repository are not copyright free, so you must not use these files elsewhere without permission.

Commercial use

・Commercial use is not permitted (non-commercial use only).

Unitychan

We follow the Unity-Chan License Terms.
https://unity-chan.com/contents/license_en/
Light_Frame.png
