This is the repository of the Seal Team for HackZurich 2023. We are a team of 5 students from different universities in Europe.
We are participating in the Logitech Challenge, and our goal is to create a solution that helps people focus on their work using Logitech products.
The inspiration behind "Mindtrics - Mind metrics. Made simple." stems from the growing importance of mental well-being and cognitive performance in our fast-paced digital world. We wanted to create a tool that effortlessly integrates into daily life, leveraging cutting-edge EEG technology and Logitech's superior audio hardware to provide users with actionable insights into their focus levels and mental state.
Our project, Mindtrics, uses the following:
- Logitech webcam
- Logitech headphones (with the proposed application of EEG electrodes onto the headphone frame)
- Logitech keyboards and mice

Together, these create a seamless way to maintain a state of flow while carrying out productive tasks.
Instead of a timer-based system, which can break a user's state of flow, we focus on keeping the flow state intact so users can extend their flow periods beyond a fixed timer. Our project incorporates real-time EEG data acquisition to track a person's level of focus over time. If the person drops below a specific focus threshold, we indicate this through our interactive website or through peripherals with RGB lighting. This provides a subtle signal of whether the user is still in a state of flow or is unable to focus.
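The threshold-based RGB indication described above could be sketched as follows. The threshold value and the green-to-red colour scheme are illustrative assumptions, not the values used in Mindtrics:

```python
# Hypothetical sketch: map a focus score to an RGB colour for
# peripheral lighting. Threshold and colours are illustrative.

def focus_to_rgb(score: float, threshold: float = 0.6) -> tuple[int, int, int]:
    """Return an (R, G, B) triple: green while the user is in flow,
    fading toward red as focus drops below the threshold."""
    if score >= threshold:
        return (0, 255, 0)  # in flow: solid green
    # Below the threshold, fade linearly from green to red.
    ratio = max(score, 0.0) / threshold
    return (int(255 * (1 - ratio)), int(255 * ratio), 0)

print(focus_to_rgb(0.8))  # (0, 255, 0)
print(focus_to_rgb(0.3))  # (127, 127, 0)
```

A real integration would push this colour to the keyboard or mouse through the vendor's lighting SDK.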
We used OpenBCI's 8-channel EEG headset, whose electrodes we plan to incorporate into the Logitech headphones. A band-pass filter and a notch filter perform the initial filtering of the signals from the EEG electrodes, which are then decomposed into the alpha, beta, gamma, theta and delta frequency bands. We use the ratio between alpha and beta power, together with a threshold, to determine how focused the user is at any point in time. The logic is similar to most focus-detecting BCI tools on the market and works with as few as 4 electrodes.
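A minimal sketch of this alpha/beta focus metric, using a plain FFT power estimate in place of the BrainFlow filter chain actually used in the project. The sampling rate, band edges and synthetic signals below are illustrative assumptions:

```python
# Simplified focus metric: estimate per-band spectral power with an
# FFT, then take the beta/alpha power ratio. The real pipeline uses
# BrainFlow filters on OpenBCI data; values here are illustrative.
import numpy as np

FS = 250  # Hz, a typical OpenBCI sampling rate (assumption)

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def focus_score(signal, fs=FS):
    """Beta/alpha power ratio: more beta relative to alpha is read as
    higher focus, the heuristic common to focus-detecting BCI tools."""
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return beta / alpha

# Synthetic check: a beta-dominated signal scores higher than an
# alpha-dominated one.
t = np.arange(0, 4, 1 / FS)
alpha_heavy = np.sin(2 * np.pi * 10 * t) + 0.2 * np.sin(2 * np.pi * 20 * t)
beta_heavy = 0.2 * np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 20 * t)
print(focus_score(beta_heavy) > focus_score(alpha_heavy))  # True
```

The score is then compared against a (possibly per-user) threshold to decide whether the user counts as focused.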
Our tool also uses a gaze-detection neural network to identify eye-movement patterns while the user works at the computer, checking for changes in gaze direction.
- Integrating EEG electrodes seamlessly into the Logitech + PC pipeline
- Using multi-processing to run multiple neural networks and algorithms on laptops
- Distributing the computing workload across different laptops
- Integrating the UI for live data streaming and results visualization
- Building a calibrated, out-of-the-box gaze-to-screen estimation module
- Aggregating the collected telemetry into a score-based system
- Integrating all the different metrics into the dashboard and focus-based scoring system
- Incorporating activity measurement and several other metrics to accurately identify focus and "state of flow"
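The score-based aggregation of telemetry mentioned above could look like this minimal sketch. The metric names and weights are hypothetical, not the ones used in Mindtrics:

```python
# Hypothetical aggregation: each telemetry source (EEG ratio, gaze
# stability, input activity) is normalised to [0, 1] and combined
# with weights into a single focus score. Names/weights illustrative.

def aggregate_focus(metrics: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Weighted mean of normalised metrics."""
    total = sum(weights.values())
    return sum(metrics[k] * w for k, w in weights.items()) / total

sample = {"eeg_ratio": 0.9, "gaze_stability": 0.7, "input_activity": 0.5}
weights = {"eeg_ratio": 0.5, "gaze_stability": 0.3, "input_activity": 0.2}
print(round(aggregate_focus(sample, weights), 2))  # 0.76
```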
Throughout this project, we deepened our understanding of EEG technology, signal processing, gaze estimation and BCI algorithms. The experience taught us valuable lessons in hardware-software integration, user-centric design, and the significance of precise data analysis in providing meaningful insights.
Looking ahead, we envision further refinement and miniaturization of the EEG integration to enhance user comfort and convenience. We plan to develop a user-friendly interface or mobile application to present the focus metrics in an easily understandable and actionable format.
Additionally, we aim to incorporate machine learning techniques to personalize the threshold for focus determination based on individual user profiles, making Mindtrics a truly tailored mental wellness tool.
Activity recognition was carried out using mmaction.
BCI processing was done using brainflow.
Since the gaze-tracking model uses an existing repo, you will also need to pull its submodule (e.g. `git submodule update --init --recursive`) before running.
Gaze estimation performs classic 2D vector gaze estimation and outputs statistics on first-order rates of gaze change. To run gaze estimation with Plotly Dash dashboards (as in the figure above):
python gaze_estimation/gaze_2d_dash.py
To run Gaze estimation standalone:
python gaze_estimation/gaze_2d_csv.py
To connect Gaze estimation remotely to Mindtrics General Dashboard:
python message_gaze_and_mouse_estimation.py
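The first-order gaze-change statistic mentioned above could be computed along these lines. The 30 Hz frame rate and the variable names are illustrative assumptions:

```python
# Sketch: given a stream of 2D gaze vectors sampled at a fixed frame
# rate, estimate gaze velocity as the frame-to-frame Euclidean
# displacement. Frame rate and names are assumptions.
import math

def gaze_velocities(points, fps=30.0):
    """Per-frame gaze speed (gaze-space units per second)."""
    return [math.dist(a, b) * fps for a, b in zip(points, points[1:])]

fixation = [(0.50, 0.50)] * 4        # steady gaze
saccade = [(0.1, 0.1), (0.5, 0.5)]   # rapid shift between frames
print(gaze_velocities(fixation))      # [0.0, 0.0, 0.0]
```

Aggregated over a session, low velocities suggest fixation on the task, while frequent large spikes suggest the gaze is wandering.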
To run the dashboard, you need to have Node.js installed. Then, you can run the following commands in the dashboard folder:
npm install
npm run dev
The dashboard will be available at http://localhost:3000
The dashboard is divided into two parts:
- The top part shows the statistics of your sessions, and the current session.
- The bottom part shows the data of the session over a specific time period. The four graphs each show a different aspect of the session:
- The first graph shows the focus level over time. The focus level is computed using the alpha and beta waves of the EEG.
- The second graph shows the activities detected by the webcam over time.
- The third graph shows the mouse position on the screen.
- The final graph shows the gaze velocity over time.
You can also start a new session by clicking the "Start Focusing!" button.
See the license information on the HackZurich webpage.