By: Felix Klotzsche
As virtual reality (VR) technology continues to evolve, it becomes increasingly accessible for cognitive neuroscientists as a research tool. This advancement allows the study of behavior in immersive settings that extend beyond the limitations of conventional screen-based experiments. In this interactive workshop, we will explore the current state of VR technology from an experimenter’s perspective. A key part of the workshop is dedicated to integrating techniques such as EEG and eye tracking into VR-based experiments. We will illustrate an exemplary workflow for extending an experimental paradigm, originally designed for a traditional 2D screen, into an immersive VR experiment. The workshop consists of a theoretical and a practical part. In the first part, we will conceptually navigate the various challenges encountered in this process and discuss the advantages and constraints of common VR headsets. We will cover the trade-off between maintaining experimental control and achieving naturalism, spatial reference frames, and the management of timing, and we will explore methodologies for incorporating the fact that the data were captured in VR into the analysis of EEG and eye-tracking data. In the second part, we encourage participants to join us in programming a simple demo experiment using the Unity game engine. This practical exercise provides insight into how some of the previously discussed theoretical concepts are applied in a real-world implementation. Our goal is to offer a comprehensive, realistic overview of the process of establishing a VR-based experiment from the perspective of a cognitive neuroscientist, thereby sharpening the intuition for whether VR integration is a valuable option for a specific research scenario.
Shortcuts
- A PDF version of the slides for the theoretical part.
- Scenes:
  - Full demo scene with all functionality implemented: `./Assets/Scenes/SceneFinal.unity`
  - Empty scene to rebuild the full scene yourself (only environment and infrastructure): `./Assets/Scenes/SceneEmpty.unity`
- The Unity layout file `layout.wlt` which I am using for the practical part
- LabStreamingLayer for Unity GitHub repository
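Since the demo streams experiment event markers via LabStreamingLayer (LSL), a common offline analysis step is aligning those marker timestamps to the continuous EEG recording. The sketch below is a minimal, hypothetical illustration in pure Python (it is not part of this repo and assumes already-synchronized clocks): it maps each marker timestamp to the index of the nearest EEG sample.

```python
import bisect

def align_markers(marker_times, sample_times):
    """Map each event-marker timestamp to the index of the nearest EEG sample.

    marker_times: timestamps (seconds) of LSL event markers, sorted ascending.
    sample_times: timestamps (seconds) of continuous EEG samples, sorted ascending.
    Returns a list of sample indices, one per marker.
    """
    indices = []
    for t in marker_times:
        i = bisect.bisect_left(sample_times, t)
        # Pick whichever neighboring sample is closer in time.
        if i == 0:
            indices.append(0)
        elif i == len(sample_times):
            indices.append(len(sample_times) - 1)
        else:
            before, after = sample_times[i - 1], sample_times[i]
            indices.append(i if after - t < t - before else i - 1)
    return indices

# Hypothetical example: 5 s of EEG sampled at 100 Hz, two trial-onset markers.
eeg_times = [k / 100.0 for k in range(500)]
markers = [1.234, 3.781]
print(align_markers(markers, eeg_times))  # nearest sample index per marker
```

In practice, LSL's own time-synchronization and tools such as the official recorder handle clock offsets between streams; this sketch only shows the nearest-sample lookup that follows.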
Most important
If you run into problems, please do not hesitate to contact me (e.g., via email; see the slides) or open an issue here. If you have questions or want to work with the code or the slides, I am happy to support you.
How to get started for the practical part:
- Install Unity Hub.
- Create a Unity ID (Personal Plan): https://id.unity.com/
- Within Unity Hub, install the latest “Official Release” (2022.3) of the Editor.
- Clone this repository to a clean local directory.
- In Unity Hub: click `Add` > select the repository folder.
- The project should now show up in the `Projects` section of your Unity Hub. Click on it to open it.
- Now you should be ready to have fun. 😊
- "Play" the experiment (easy):
  - Open the full demo scene with all functionality implemented: `./Assets/Scenes/SceneFinal.unity`.
  - Press the `Play` button in Unity (top center).
  - In the `Game` window within your Unity Editor, you now see the participant's view. You can turn the "head" using your mouse or touchpad. To move around, you can use the `W`, `A`, `S`, `D` keys (move: forward, left, back, right), as well as `E` and `Q` (move: up and down).
  - To start the experiment, click `Submit` in the control window on the left. Then press `Start Experiment`, then `Proceed` (2 times, until it is greyed out). You can now hide the control window by clicking on `Toggle eDIA` (bottom right corner in the `Game` window).
  - To start the first trial, look at the fixation target displayed on the tablet on the workbench (i.e., align the gaze target in your field of view with the fixation target). Now do the Posner task by "looking" as fast as possible at the target objects on the left and right. At the beginning of each trial, you have to look back at the fixation target.
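In a Posner cueing task like this one, the typical outcome measure is whether responses to validly cued targets are faster than to invalidly cued ones. As a rough illustration of that analysis step (pure Python, with hypothetical trial data; the actual demo logs its own format), one might summarize gaze reaction times by cue validity like this:

```python
def mean_rt_by_validity(trials):
    """Average reaction time (s) separately for valid- and invalid-cue trials.

    trials: list of dicts with keys 'valid' (bool) and 'rt' (seconds).
    Returns (mean_rt_valid, mean_rt_invalid).
    """
    valid = [t["rt"] for t in trials if t["valid"]]
    invalid = [t["rt"] for t in trials if not t["valid"]]
    return sum(valid) / len(valid), sum(invalid) / len(invalid)

# Hypothetical trials: valid cues typically yield faster saccadic responses.
demo_trials = [
    {"valid": True, "rt": 0.21}, {"valid": True, "rt": 0.19},
    {"valid": False, "rt": 0.28}, {"valid": False, "rt": 0.31},
]
rt_valid, rt_invalid = mean_rt_by_validity(demo_trials)
print(rt_valid, rt_invalid)  # expect a validity benefit: rt_valid < rt_invalid
```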
- Rebuild the experiment (advanced):
  - Open the empty scene to rebuild the full scene yourself (only environment and infrastructure): `./Assets/Scenes/SceneEmpty.unity`.
  - Try to reconstruct the task by following along while I am doing it, or by peeking into the full scene every now and then. You can use the `tags` in the repo to identify the corresponding commits.
- `2024-02`: Workshop at the Dep. of Experimental Psychology, University of Oxford – 19 Feb 2024 (Felix Klotzsche & Sven Ohl)