WangDongBUAA/Mapping_Interfacing

In this work, we propose a human-robot interaction framework based on three-dimensional mapping and virtual reality (VR) visualization.

On the construction client, dense three-dimensional point-cloud maps are built with the RTAB-Map SLAM algorithm and a Kinect-style depth camera.
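
The description above does not include this project's launch or node files, so the following is only a minimal sketch of how the dense map could be consumed on the construction client, assuming a ROS setup with the rtabmap_ros package and its default /rtabmap/cloud_map topic; the node name and topic are assumptions, not taken from this repository.

```python
#!/usr/bin/env python
# Minimal sketch: listen to the dense point-cloud map that rtabmap_ros
# publishes while the Kinect-style depth camera is moved through the scene.
# The topic name and message type follow rtabmap_ros defaults and are
# assumptions; this project may remap or rename them.
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2

def on_cloud(msg):
    # Count points in the latest cloud map as a simple health check.
    n = sum(1 for _ in pc2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True))
    rospy.loginfo("cloud_map update: %d points", n)

if __name__ == "__main__":
    rospy.init_node("cloud_map_listener")
    rospy.Subscriber("/rtabmap/cloud_map", PointCloud2, on_cloud)
    rospy.spin()
```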

A data communication network transmits the optimized models to the VR-based exploration client. Through VR visualization, the operator gains an intuitive and comprehensive understanding of the environment to be explored. The system maps the three-dimensional agricultural scenes into VR models, linking the physical world and the VR space more closely.
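
The description does not specify how the optimized models are transferred, so the following is only a minimal sketch of that step, assuming the map is exported as a PLY file and pushed to the VR exploration client over plain TCP; the host, port, file format, and protocol here are illustrative assumptions, not this project's actual interface.

```python
# Minimal sketch of sending an exported map file to the VR exploration client.
# Transport (TCP), format (PLY), host, and port are assumptions for illustration.
import socket
import struct

def send_model(path, host="192.168.1.50", port=9000):  # hypothetical VR-client address
    with open(path, "rb") as f:
        payload = f.read()
    with socket.create_connection((host, port)) as sock:
        # Length prefix so the receiver knows where the model ends.
        sock.sendall(struct.pack("!Q", len(payload)))
        sock.sendall(payload)

if __name__ == "__main__":
    send_model("cloud_map.ply")  # hypothetical exported map file
```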

The main goal of this research is to improve human-robot interaction in the mapping task through VR. We propose a system framework that allows humans to assist a mobile robot in creating three-dimensional agricultural maps. To enhance the immersive exploration experience, VR devices provide intuitive interfaces between humans and robots.

We will continue to enhance this project in subsequent stages and provide additional applications and supporting materials for users to download and use.
