This package provides a simulator for vision-based tactile sensors, such as DIGIT. It includes models for integration with PyBullet, as well as a renderer of touch readings. For more information, refer to the corresponding paper: TACTO: A Fast, Flexible, and Open-source Simulator for High-resolution Vision-based Tactile Sensors.
NOTE: the simulator is not meant to provide physically accurate contact dynamics (e.g., deformation, friction); it relies on existing physics engines for that.
For updates and discussions, please join the #TACTO channel of the www.touch-sensing.org community.
The preferred way of installation is through PyPI:
pip install tacto
Alternatively, you can manually clone the repository and install the package using:
git clone https://github.com/facebookresearch/tacto.git
cd tacto
pip install -e .
This package contains several components:
- A renderer to simulate readings from vision-based tactile sensors.
- An API to simulate vision-based tactile sensors in PyBullet (see the usage sketch after this list).
- Mesh models and configuration files for the DIGIT and OmniTact sensors.
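As a rough sketch of how these components fit together, the snippet below creates a DIGIT sensor, attaches it to a body in PyBullet, and renders tactile readings. It follows the pattern of the example scripts; the constructor arguments, the method names (add_camera, add_body, render, updateGUI), and the URDF paths are assumptions to be checked against the examples.

```python
import pybullet as p
import pybulletX as px
import tacto

# Start a PyBullet client through pybulletX.
px.init()

# Create a DIGIT sensor; width/height set the tactile image resolution.
digits = tacto.Sensor(width=120, height=160, visualize_gui=True)

# Load the DIGIT body and attach the sensor camera to its link
# (URDF paths are placeholders; see the configs under examples/conf).
digit_body = px.Body(urdf_path="meshes/digit.urdf", use_fixed_base=True)
digits.add_camera(digit_body.id, [-1])

# Load an object and register it with the tactile renderer.
obj = px.Body(urdf_path="examples/objects/sphere_small.urdf", global_scaling=0.15)
digits.add_body(obj)

# Step the simulation and render tactile color + depth readings.
while True:
    p.stepSimulation()
    colors, depths = digits.render()
    digits.updateGUI(colors, depths)
```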
Additional packages (torch, gym, pybulletX) are required to run the following examples. You can install them with:
pip install -r requirements/examples.txt
For a basic example of how to use TACTO in conjunction with PyBullet, look at [TBD].
For an example of how to use just the renderer, look at examples/demo_render.py.
For advanced examples of how to use the simulator with PyBullet, look at the examples folder:
- examples/demo_pybullet_digit.py: rendering RGB and Depth readings with a DIGIT sensor.
- examples/demo_pybullet_allegro_hand.py: rendering 4 DIGIT sensors on an Allegro Hand.
- examples/demo_pybullet_omnitact.py: rendering RGB and Depth readings with an OmniTact sensor.
- examples/demo_pybullet_grasp.py: grasping objects in different configurations with DIGIT sensors mounted on parallel-jaw grippers.
- examples/demo_pybullet_rolling.py: rolling a marble with two DIGIT sensors.
- examples/demo_pybullet_digit_shadow.py: rendering with shadows enabled.
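Each demo is a standalone script; for instance, the basic DIGIT demo can be launched from the repository root (with the example dependencies installed) via:
python examples/demo_pybullet_digit.py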
NOTE: the renderer requires a screen. For headless rendering, use the "EGL" mode with a GPU and CUDA drivers, or "OSMESA" with the CPU. See PyRender for more details.
Additionally, install the patched version of PyOpenGL via:
pip install git+https://github.com/mmatl/pyopengl.git@76d1261adee2d3fd99b418e75b0416bb7d2865e6
You may then specify which engine to use for headless rendering, for example:
import os
os.environ["PYOPENGL_PLATFORM"] = "osmesa"  # OSMesa CPU rendering
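For GPU-based headless rendering, set the variable to "egl" instead. PyRender reads PYOPENGL_PLATFORM at import time, so it must be set before pyrender (and therefore tacto) is imported; the snippet below sketches that ordering.

```python
import os

# Select the EGL backend for GPU-accelerated headless rendering.
# Must be set before pyrender (and hence tacto) is imported.
os.environ["PYOPENGL_PLATFORM"] = "egl"

import tacto  # imported only after the environment variable is set
```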
We recommend conducting experiments on Ubuntu. On macOS, there is a known visualization issue between pybullet.GUI and pyrender. Please let us know if you manage to resolve it, and we will share the information in the repo!
This project is licensed under the MIT License, as found in the LICENSE file.
If you use this project in your research, please cite:
@Article{Wang2022TACTO,
  author  = {Wang, Shaoxiong and Lambeta, Mike and Chou, Po-Wei and Calandra, Roberto},
  title   = {{TACTO}: A Fast, Flexible, and Open-source Simulator for High-resolution Vision-based Tactile Sensors},
  journal = {IEEE Robotics and Automation Letters (RA-L)},
  year    = {2022},
  volume  = {7},
  number  = {2},
  pages   = {3930--3937},
  issn    = {2377-3766},
  doi     = {10.1109/LRA.2022.3146945},
  url     = {https://arxiv.org/abs/2012.08456},
}