Reality Fusion is a high-performance, robust immersive robot teleoperation system that combines the best of both worlds: the high fidelity of neural rendering (3D Gaussian Splatting) and real-time stereoscopic point cloud projection.
- **Reality Fusion Unity Project**: Documentation for the `RealityFusionUnity` Unity project and source code for VR robot control applications.
- **Reality Fusion Native Render Plugin**: Instructions for compiling the original 3DGS CMake project as a Unity native render plugin. Pre-compiled DLL files are already available in `Assets\Plugins\x86x64` in the Unity project.
- **Reality Fusion Robot Setup**: Documentation for setting up the robot, including required packages.
Link to preprint: Reality Fusion: Robust Real-time Immersive Mobile Robot Teleoperation with Volumetric Visual Data Fusion
```bibtex
@inproceedings{reality-fusion,
  author    = {Li, Ke and Bacher, Reinhard and Schmidt, Susanne and Leemans, Wim and Steinicke, Frank},
  title     = {Reality Fusion: Robust Real-time Immersive Mobile Robot Teleoperation with Volumetric Visual Data Fusion},
  booktitle = {2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year      = {2024},
  note      = {to appear}
}
```
Contact: keli95566@gmail.com
This work was supported by DASHH (Data Science in Hamburg - HELMHOLTZ Graduate School for the Structure of Matter) with the Grant-No. HIDSS-0002.
Please refer to INRIA's license for the original implementation of 3DGS.