Control Philips Hue lights using a Leap Motion hand tracker
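A minimal sketch of how such a controller might work, not this repository's actual code: it assumes the leapjs package and Node's built-in fetch, and the bridge IP, API username, and light ID are placeholders. Vertical palm position is mapped onto the Hue bridge's 0–254 brightness range.

```js
// Hedged sketch: map Leap Motion palm height to Philips Hue brightness.
// Assumes the leapjs package (npm install leapjs) and Node 18+ for fetch().
const Leap = require('leapjs');

const BRIDGE_IP = '192.168.1.2';      // placeholder: your Hue bridge address
const API_USER = 'your-bridge-user';  // placeholder: authorized bridge username
const LIGHT_ID = 1;                   // placeholder: target light

let lastSent = 0;

Leap.loop(function (frame) {
  if (frame.hands.length === 0) return;

  // palmPosition is [x, y, z] in millimetres; y grows upward from the device.
  const y = frame.hands[0].palmPosition[1];

  // Map roughly 100–500 mm of hand height onto the Hue brightness range 0–254.
  const bri = Math.round(Math.max(0, Math.min(1, (y - 100) / 400)) * 254);

  // Throttle requests so the bridge is not flooded on every frame.
  const now = Date.now();
  if (now - lastSent < 200) return;
  lastSent = now;

  fetch(`http://${BRIDGE_IP}/api/${API_USER}/lights/${LIGHT_ID}/state`, {
    method: 'PUT',
    body: JSON.stringify({ on: true, bri }),
  }).catch(console.error);
});
```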
Ryutai is an interactive experience exploring the meditative effects of water. Requires a Leap Motion controller.
A Unity project that integrates Leap Motion technology for gestural commands and physical interactions in VR
Selection of my Pure Data projects
An online theremin developed to explore the possibilities of Leap Motion.
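A minimal browser sketch of the theremin idea, assuming the leapjs browser bundle (global `Leap`) and the Web Audio API; the frequency and volume ranges below are illustrative. Hand height controls pitch and left–right position controls volume.

```js
// Hedged sketch: Leap Motion hand position driving a Web Audio oscillator.
// Most browsers require a user gesture before audio playback can start.
const audioCtx = new AudioContext();
const osc = audioCtx.createOscillator();
const gain = audioCtx.createGain();
osc.type = 'sine';
osc.connect(gain).connect(audioCtx.destination);
osc.start();

Leap.loop(function (frame) {
  if (frame.hands.length === 0) {
    gain.gain.value = 0;                        // silence when no hand is tracked
    return;
  }
  const [x, y] = frame.hands[0].palmPosition;   // millimetres relative to the device

  // Map ~100–500 mm of hand height onto roughly two octaves (220–880 Hz).
  const freq = 220 + Math.max(0, Math.min(1, (y - 100) / 400)) * 660;
  // Map ±200 mm of left–right position onto volume 0–1.
  const vol = Math.max(0, Math.min(1, (x + 200) / 400));

  osc.frequency.setValueAtTime(freq, audioCtx.currentTime);
  gain.gain.setValueAtTime(vol, audioCtx.currentTime);
});
```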
Sports Casting App adapted for an "intelligent" living room, with support for multiple input modalities.
Custom classes built on top of the Interaction Engine.
Leap Motion-controlled UR5 virtual robot in RViz using MoveIt! with ROS
Leap Motion Interface Assets for IntuiFace
LEAPMidi
JS Leap Motion sensor fusion, using the MediaPipe hand-tracking system to expand the range of possible hand gestures (a sketch of the idea follows).
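A hedged sketch of the fusion idea, not this repository's code: Leap Motion supplies the 3D palm position while MediaPipe Hands classifies an extra gesture (here, a simple pinch) from webcam landmarks. The video element id, CDN path, and pinch threshold are illustrative assumptions.

```js
// Hedged sketch: fuse Leap Motion 3D tracking with MediaPipe Hands gestures.
// Assumes the leapjs browser bundle (global Leap) and @mediapipe/hands.
import { Hands } from '@mediapipe/hands';

const state = { palm: null, pinching: false };

// Leap Motion side: track the palm position in millimetres.
Leap.loop(function (frame) {
  state.palm = frame.hands.length ? frame.hands[0].palmPosition : null;
});

// MediaPipe side: classify a pinch from normalized image-space landmarks.
const hands = new Hands({
  locateFile: (f) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${f}`,
});
hands.setOptions({ maxNumHands: 1, minDetectionConfidence: 0.7 });
hands.onResults((results) => {
  const lm = results.multiHandLandmarks && results.multiHandLandmarks[0];
  if (!lm) { state.pinching = false; return; }
  const dx = lm[4].x - lm[8].x;                 // thumb tip vs. index fingertip
  const dy = lm[4].y - lm[8].y;
  state.pinching = Math.hypot(dx, dy) < 0.05;   // illustrative pinch threshold
});

// Feed webcam frames to MediaPipe (the 'webcam' video element id is assumed).
const video = document.getElementById('webcam');
setInterval(() => hands.send({ image: video }), 100);

// Fused event: a pinch gesture located at the Leap-tracked 3D palm position.
setInterval(() => {
  if (state.pinching && state.palm) console.log('pinch at', state.palm);
}, 100);
```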
Leap Motion interactivity demonstration built around Babylon.js for a CSCI-438 Advanced Game Development class
A VR game built with a Leap Motion sensor and Unreal Engine. We developed it for children with autism spectrum disorder (ASD) to help them improve their reaction time and motor coordination. Our team has published a research paper on this prototype.
Deprecated framework for using Leap Motion hand sensors to interact with a physically accurate virtual environment. See https://github.com/quinnciccoretti/physvr for the functioning project.
StarkShoot is a multiplayer first-person shooter game developed with Unity3D, supporting various input devices such as Kinect, Xbox controllers, Leap Motion, and VR glasses. Key features include login and game interfaces, player models and animations, realistic gun models, and networking with Photon.