replicAnt - generating annotated images of animals in complex environments with Unreal Engine
by Fabian Plum, René Bulla, Hendrik Beck, Natalie Imirzian, and David Labonte (2023)
replicAnt produces procedurally generated and richly annotated image samples from 3D models of animals. These images and annotations constitute synthetic data, which can be used in a wide range of computer vision applications. (a) The inputs to the replicAnt pipeline are digital 3D models; we generated high-fidelity models with the open-source photogrammetry platform scAnt. Each model comprises (b) a textured mesh, (c) an armature which provides control over animal pose, and (d) a low-polygon collision mesh to control the interaction of the model with objects in its environment. (e) 3D models are placed within scenes procedurally generated with the free software Unreal Engine 5. (f) Every scene consists of the same core elements, each with configurable randomisation routines to maximise variability in the generated data. 3D assets are scattered on a ground with complex topology; layered materials, decals, and light sources provide significant variability for the generated scenes. From each scene, we generate (g) image, (h) ID, (i) depth, and normal passes, accompanied by (j) an annotation data text file which contains information on image contents. Deep learning-based computer vision applications which can be informed by the synthetic data generated by replicAnt include (k) detection, (l) tracking, (m) 2D and 3D pose estimation, and (n) semantic segmentation.
- 16.04.2023 - Added first official release version 1.0.0
- Windows 10 (other operating systems may work but are untested)
- ~ 50 GB of disk space (the faster the better)
- Unreal Engine itself will occupy roughly 30 GB
- Another ~5 GB is required for the complete project, including 3D assets and materials
- As a rough guide, a 10k sample dataset at 2k resolution requires ~5 GB (assuming all pass types are enabled)
- Dedicated GPU with 6GB VRAM (currently only tested on NVIDIA GPUs)
- 16 GB RAM
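To budget disk space for your own runs, the ~5 GB per 10k samples figure above scales roughly linearly with sample count. A minimal sketch (the 5 GB/10k constant is taken from the guide above and assumes 2k resolution with all pass types enabled):

```python
def estimated_dataset_size_gb(n_samples, gb_per_10k_samples=5.0):
    """Rough disk footprint, scaled linearly from the guide value of
    ~5 GB per 10,000 samples at 2k resolution with all pass types."""
    return n_samples / 10_000 * gb_per_10k_samples

# e.g. a 100k sample dataset:
print(f"{estimated_dataset_size_gb(100_000):.0f} GB")  # -> 50 GB
```

Actual sizes vary with image resolution and the pass types you enable, so treat this as an order-of-magnitude estimate only.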
To convert the generated datasets into formats accepted by common computer vision frameworks, a number of parsers are provided in the form of interactive Jupyter notebooks.
A Python installation on your system is required to make use of these parsers. For ease of use, we provide an example conda environment, as well as a list of dependencies, in case you want to use a custom Python installation or environment:
Install dependencies via conda
cd conda_environment
conda env create -f conda_replicAnt.yml
After the environment has been created successfully, restart the terminal and run the following line to activate the environment and continue the installation.
conda activate replicAnt
If you do not wish to install the pre-configured environment, the relevant dependencies are:
- python >= 3.7
- pip
- notebook
- numpy
- matplotlib
- opencv
- json5
- pandas
- pathlib
- imutils
- scikit-learn
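The parsers themselves are interactive notebooks, but their core operation, reading a per-image annotation file and extracting labels, can be sketched in a few lines of plain Python. Note that the JSON layout below is purely illustrative (the key names are hypothetical); the real structure is defined by the replicAnt exporter, so consult an actual annotation file from your generator run:

```python
import json

# Purely illustrative annotation record -- the real key names and layout
# are defined by the replicAnt exporter and may differ.
sample_annotation = {
    "image": "sample_0001_image.png",
    "subjects": [
        {"id": 1, "bbox": [120, 80, 64, 48]},   # x, y, width, height (pixels)
        {"id": 2, "bbox": [300, 210, 52, 40]},
    ],
}

def extract_bboxes(annotation):
    """Return (subject_id, bbox) pairs from one annotation record."""
    return [(s["id"], s["bbox"]) for s in annotation["subjects"]]

# round-trip through JSON, as the parsers do when reading files from disk
record = json.loads(json.dumps(sample_annotation))
print(extract_bboxes(record))
```

The provided notebooks perform the same kind of extraction and then re-package the labels into the target framework's format (e.g. detection or pose-estimation datasets).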
- You will first need to create an Epic Games account, which we will later link to your GitHub profile. This grants you access to the Unreal Engine source code, including Blender plugins (i.e., Send2Unreal), and, as a sweet bonus, access to Quixel's asset library, which holds additional meshes and materials that you can add to your generator environment for further customisation:
- Download and install the Epic Games Launcher. From there, your (to be) installed Unreal Engine environments can be managed and updated:
- Open the installed Epic Games Launcher and click on Unreal Engine on the left side of the window.
Select Library and click on the + icon to install a new version of Unreal Engine. replicAnt is built on Unreal Engine 5, so make sure to select the latest Unreal Engine 5.0 release, and follow the installation guide (issues have been reported with builds later than 5.1, which we are currently investigating).
Unless you are planning on running extensive debugging or further development, installation of the core components should be sufficient. All additionally required functionality is provided in our project environment or, alternatively, can be installed later on.
- While your computer is busy installing Unreal Engine, connect your GitHub account to your newly created Epic Games account. For a thorough guide, refer to the [official documentation](https://www.unrealengine.com/en-US/ue-on-github).
In short, head over to your Epic Games account, and under Connections, connect to your GitHub profile. Simply follow the instructions prompted in your browser and authorize Epic Games. You will then receive a confirmation email to join the Epic Games organisation on GitHub to access all source code and plugins.
If all has gone well (fingers crossed, but we all know what computers are like...), your GitHub profile should confirm that you have successfully joined Epic Games!
- Once the Unreal Engine installation has completed in the background, restart your computer. Afterwards, we’ll set up the project.
- If you have not done so already, download the latest replicAnt release to your computer and unzip it.
Alternatively, you can clone the full GitHub repository, in case you want to actively partake in further development:
git clone https://github.com/evo-biomech/replicAnt
- Once the download has finished, you will next need to download the curated set of assets (3D meshes and materials). NOTE: This project runs under a non-commercial license, and any assets used for the generation of synthetic datasets may not be used or re-distributed for commercial purposes. It's all about the free stuff for everyone.
Download replicAnt external content files
Download and unpack the files into the Content directory of the replicAnt project.
- Launch replicAnt.uproject by double-clicking on the file. When opening the project for the first time, it may take up to 30 minutes to compile all shaders. You can use the spare time to install Blender and the latest version of Send2Unreal, or for some exercise.
- In the content browser, right-click on the file named replicAnt_Interface and select Run Editor Utility Widget.
Now, you should be able to see the replicAnt interface on the left side of your screen, where you can configure every part of the generator - from file types and simulated colonies, to adding further animals, and controlling the generator seed for benchmarking and debugging purposes.
In theory, you can now start generating your first datasets (using the provided subject models)!
If you want to bring your own (animal) models into the generator, kindly refer to the following detailed guides:
More guides on advanced usage and customisation will follow soon!
- Additional assets need to be downloaded and placed into the content folder. These files are hosted externally under the following link: Google Drive
- We regularly update a library of pre-configured subject models, which can be downloaded here.
The following subject models are currently available:
- Leafcutter ants - Atta vollenweideri (various worker sizes, ranging from 1.1 mg to 50.1 mg)
- Desert termites - Gnathamitermes (worker and soldier)
- Praying mantis - Gongylus gongylodes
- Stick insects - Sungaya inexpectata (various instars)
- Stick insects - Peruphasma schultei (male and female)
- Leaf-footed bug - Leptoglossus zonatus (adult)
- Desert ants - Pogonomyrmex desertorum (worker)
The replicAnt toolbox has many moving parts and uses experimental software tools, so minor (and sometimes major) hiccups will be common, much like the cold, in particular when the pipeline is extended, or when Unreal and Blender are updated. To aid the struggling user, we have compiled a list of common issues and, more importantly, describe how to fix them:
In case the issue you are encountering is not listed in the Troubleshooting guide, feel free to open an issue, using the bug report template. We will try to reply to all issues quickly, but bear in mind that this is free software, and sometimes replies can take some time.
When using replicAnt and/or our other projects in your work, please make sure to cite them:
@article{PlumLabonte2021,
title = {scAnt — An open-source platform for the creation of 3D models of arthropods (and other small objects)},
author = {Plum, Fabian and Labonte, David},
doi = {10.7717/peerj.11155},
issn = {2167-8359},
journal = {PeerJ},
keywords = {3D,Digitisation,Macro imaging,Morphometry,Photogrammetry,Zoology},
volume = {9},
year = {2021}
}
@misc{Plum2023b,
title = {OmniTrax},
author = {Plum, Fabian},
note = {GitHub repository},
howpublished = {https://github.com/FabianPlum/OmniTrax},
year = {2022}
}
@article{Plum2023a,
title = {replicAnt: a pipeline for generating annotated images of animals in complex environments using Unreal Engine},
author = {Plum, Fabian and Bulla, René and Beck, Hendrik K and Imirzian, Natalie and Labonte, David},
doi = {10.1038/s41467-023-42898-9},
issn = {2041-1723},
journal = {Nature Communications},
url = {https://doi.org/10.1038/s41467-023-42898-9},
volume = {14},
year = {2023}
}
Pull requests are warmly welcome. For major changes, please open an issue first to discuss what you would like to change.
© Fabian Plum, René Bulla, David Labonte 2023, MIT License