A Python toolbox for analysing body movements across space and time, to aid the study of animal behaviour in neuroscience.
Create and activate a conda environment with movement installed:
conda create -n movement-env -c conda-forge movement
conda activate movement-env
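Once the environment is active, you can check that movement was installed correctly by importing it and printing its version. A minimal sketch, assuming a standard conda-forge installation:

import importlib.metadata

# Importing the package confirms it is available in the active environment
import movement

# Query the installed version via package metadata
print(importlib.metadata.version("movement"))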
Note
Read the documentation for more information, including full installation instructions and examples.
Pose estimation tools, such as DeepLabCut and SLEAP, are now commonplace when processing video data of animal behaviour. There is not yet a standardised, easy-to-use way to process the pose tracks produced by these software packages.
movement aims to provide a consistent, modular interface for analysing pose tracks, supporting steps such as data cleaning, visualisation and motion quantification. We aim to support a range of pose estimation packages, as well as 2D or 3D tracking of single or multiple individuals.
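As an illustration, the sketch below loads predicted pose tracks from a SLEAP analysis file into an xarray dataset. The file path and frame rate are hypothetical, and the loader names (movement.io.load_poses with from_sleap_file and from_dlc_file) reflect the documented interface at the time of writing, which may change while the package is in early development:

from movement.io import load_poses

# Load predicted pose tracks from a SLEAP analysis file (hypothetical path),
# attaching the video's frame rate so time is expressed in seconds
ds = load_poses.from_sleap_file("path/to/predictions.analysis.h5", fps=30)

# Equivalent loaders exist for other supported formats, e.g. DeepLabCut:
# ds = load_poses.from_dlc_file("path/to/predictions.h5", fps=30)

# The result is an xarray Dataset; print it to inspect its dimensions
# (e.g. time, individuals, keypoints, space) and data variables
print(ds)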
Find out more on our mission and scope statement and our roadmap.
Warning
🏗️ The package is currently in early development and the interface is subject to change. Feel free to play around and provide feedback.
Contributions to movement are absolutely encouraged, whether to fix a bug, develop a new feature, or improve the documentation. To help you get started, we have prepared a detailed contributing guide.
You are welcome to chat with the team on Zulip. You can also open an issue to report a bug or request a new feature.
If you use movement in your work, please cite the following Zenodo DOI:
Nikoloz Sirmpilatze, Chang Huan Lo, Sofía Miñano, Brandon D. Peri, Dhruv Sharma, Laura Porta, Iván Varela & Adam L. Tyson (2024). neuroinformatics-unit/movement. Zenodo. https://zenodo.org/doi/10.5281/zenodo.12755724
⚖️ BSD 3-Clause
This package layout and configuration (including pre-commit hooks and GitHub Actions) have been copied from the python-cookiecutter template.