VR-Enhanced Pages is a project that uses virtual reality to enhance the learning experience for students. It includes two models, TURB and STIR, which work together to provide an immersive, interactive educational experience.
TURB is a Transformer model that powers the project's doubt bot and summarizer module. With TURB, students can ask questions about the video they are watching, and the bot attempts to answer them using the video's caption data. The summarizer module also displays a text summary of the video's content after the video ends.
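As a rough illustration of how a caption-grounded doubt bot can work, the sketch below ranks caption segments by word overlap with the question and returns the best match. It is a minimal stand-in for TURB's actual Transformer inference; the function name and scoring scheme are illustrative assumptions, not the project's real code.

```javascript
// Illustrative sketch only: TURB itself is a Transformer model, not a
// word-overlap ranker. This shows the general retrieve-from-captions idea.
function answerFromCaptions(question, captions) {
  const questionWords = new Set(question.toLowerCase().match(/\w+/g) || []);
  let best = null;
  let bestScore = 0;
  for (const segment of captions) {
    const words = segment.toLowerCase().match(/\w+/g) || [];
    // Count how many words in this caption segment also appear in the question.
    const score = words.filter((w) => questionWords.has(w)).length;
    if (score > bestScore) {
      bestScore = score;
      best = segment;
    }
  }
  return best; // null when no segment shares any words with the question
}
```

A real model would weigh meaning rather than exact word matches, but the pipeline shape is the same: the question and the caption text go in, and the most relevant caption content comes back as the answer.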
STIR is a text-to-3D model that lets students enter a prompt and visualize it as a 3D model or a 2D image. When a matching model appears in the video, STIR follows its color scheme; otherwise it generates one on its own. STIR strengthens the visual side of the learning experience and makes complex concepts easier to grasp.
The project is built using QuickXR, an A-Frame component framework that powers all the dynamic operations and provides CSS support, canvas support, video streaming, and Web DOM support, while reducing latency by using Chrome's Blink engine. QuickXR provides a seamless and immersive VR experience for students.
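A-Frame scenes are declared as HTML, so a page built on it might embed a lecture video along these lines. This is a generic A-Frame sketch, not QuickXR's actual markup; the asset name and layout are assumptions for illustration.

```html
<a-scene>
  <a-assets>
    <!-- Hypothetical video asset; in the real project the source comes from a YouTube video ID -->
    <video id="lecture" src="lecture.mp4"></video>
  </a-assets>
  <!-- Flat screen showing the lecture inside the VR environment -->
  <a-video src="#lecture" width="4" height="2.25" position="0 1.6 -3"></a-video>
</a-scene>
```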
In the VR environment, students can enter a YouTube video ID, and the video will load in the scene. Five questions are generated from the video's captions so students can evaluate themselves after watching. The TURB-powered doubt bot can answer questions during playback, and the STIR model can visualize prompts as 3D models or 2D images.
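Loading a video from an ID typically starts with validating the ID and building an embed URL. Standard YouTube video IDs are 11 URL-safe characters; the helper below is an illustrative assumption, not the project's actual loader.

```javascript
// Hypothetical helper: turn a student-entered YouTube video ID into an embed URL.
function buildEmbedUrl(videoId) {
  // YouTube IDs are 11 characters drawn from letters, digits, '_' and '-'.
  if (!/^[A-Za-z0-9_-]{11}$/.test(videoId)) {
    throw new Error("Invalid YouTube video ID");
  }
  return `https://www.youtube.com/embed/${videoId}`;
}
```

Validating up front gives students an immediate error in the VR interface instead of a silently blank video screen.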
To get started with VR-Enhanced Pages, visit https://vr-enhanced.pages.dev on a compatible VR headset such as the Meta Quest 2 and immerse yourself in the enhanced learning experience. Please refer to the documentation for details on navigating the VR environment and accessing the project's features.
Contributions to the project are welcome. Please refer to the CONTRIBUTING.md file for guidelines on how to contribute.
This project is licensed under the MIT License. Please refer to the LICENSE file for more information.