A project within the module "Ubiquitous Computing". The aim of the project is to build a system that monitors a user's posture while lifting heavy objects and gives suggestions for improvement. https://youtu.be/2RV6aqpohyI
- Visualize back posture
- Check if person is lifting something
- Recognize a bad posture
- Alert user by beeping or vibrating
- Solder ESP8266 #1
- Solder ESP8266 #2
- Test FSR
- Solder ESP8266 with FSR #3
- Make client.ino for FSR
- Make client.ino work
- Make server.ino work
- Test with two clients
- Test with three clients
- Make server.ino work with webserver
- Add status display for clients
- Add beeper to foot https://www.arduino.cc/en/Tutorial/BuiltInExamples/toneMelody
- Add beeper to back
- Request status on boot for clients
- Add function to send data to clients
- Server.js should process data further
- Calibrate MPU6050 sensors
- Calculate correct angles from the MPU data (see the sketch after this list)
- Display the data on the frontend
- Look into neural networks for posture recognition
- Improve webserver data handling
- Smooth values from the sensors (also covered in the sketch after this list)
- Use multiple axes
- Find a better method to time the training
- Restrict training to 20 lines per file
- Fix null error in training
- Visualize all raw data
- Use visualizations to process the data further
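The angle and smoothing items above boil down to a small amount of math. A minimal sketch of it in Python follows; the function names, the smoothing factor, and the example values are illustrative and not taken from the repo, whose actual implementation lives in the ESP8266 sketches and server.js.

```python
import math

def pitch_roll(ax, ay, az):
    """Tilt angles in degrees from raw MPU6050 accelerometer counts.
    The formulas use only ratios, so raw counts work without scaling to g."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def smooth(prev, new, alpha=0.2):
    """Exponential moving average; lower alpha is smoother but laggier."""
    return alpha * new + (1 - alpha) * prev

# A sensor lying roughly flat reports almost all of gravity on the z axis:
print(pitch_roll(120, -310, 16200))  # ≈ (-0.42, -1.10) degrees
```

Accelerometer-only angles get noisy as soon as the wearer moves, which is why smoothing the values (or fusing in the gyroscope) is on the list.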
- ESP8266/Client -> The sketch for the ESP8266s that have an MPU6050 connected; they connect to the AP and send the sensor data
- ESP8266/Server -> The sketch for the ESP8266 that opens the WiFi AP and listens for the WebSocket data
- ESP8266/FSR -> The sketch for the ESP8266 that has the FSRs connected; it connects to the AP and sends the sensor data
- Server -> The Node.js server that gets the data from the serial port connected to the Access Point ESP8266
- src -> The Parcel webserver files
- data_analysis -> Everything data-related (mostly CSV files and Jupyter notebooks)
Screenshot of the current version of the frontend
```
node server/record_data.js
```
This works best if you have a wireless keyboard/controller and bind 'a'/'b' to one of the buttons.
- Connect all the sensors and the server MCU to USB
- Wear the sensors
- Press 'a' or 'b' to start a recording
- Perform a motion
- Press 'a' or 'b' again to stop the recording
Data recording is done via the Node.js script. It is mostly inspired by how charliegerard did it in her project: https://github.com/charliegerard/gestures-ml-js
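For orientation, here is a hypothetical Python analogue of that recording loop; the repo's actual script is server/record_data.js, and the serial port, baud rate, and file naming below are assumptions.

```python
import time
import serial                # pip install pyserial
from pynput import keyboard  # pip install pynput

PORT, BAUD = "/dev/ttyUSB0", 115200  # assumed; match your server ESP8266
recording = None                     # open file handle while recording

def on_press(key):
    """Toggle recording when 'a' or 'b' is pressed."""
    global recording
    label = getattr(key, "char", None)  # special keys have no .char
    if label not in ("a", "b"):
        return
    if recording is None:
        recording = open(f"server/data/{label}_{int(time.time())}.csv", "w")
        print(f"recording '{label}' started")
    else:
        recording.close()
        recording = None
        print("recording stopped")

keyboard.Listener(on_press=on_press).start()  # key handling in its own thread

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    while True:
        line = ser.readline().decode(errors="ignore").strip()
        f = recording
        if line and f is not None:
            try:
                f.write(line + "\n")
            except ValueError:
                pass  # file was closed by the key handler mid-write
```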
The main file is data_analysis/complete_set_analysis.ipynb. It takes all files from server/data, parses their names, and reads them into a pandas DataFrame.
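As a rough sketch of that loading step, assuming each recording is a CSV in server/data whose filename starts with its label (e.g. a_1612345678.csv; the notebook's exact parsing may differ):

```python
from pathlib import Path
import pandas as pd

frames = []
for csv_path in Path("server/data").glob("*.csv"):
    df = pd.read_csv(csv_path, header=None)    # raw sensor lines, no header row
    df["label"] = csv_path.stem.split("_")[0]  # label parsed from the filename
    frames.append(df)

data = pd.concat(frames, ignore_index=True)
print(data.shape, data["label"].unique())
```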
Data analysis is heavily inspired by the Atltvhead Gesture Recognition Bracer project and its detailed video: https://github.com/ATLTVHEAD/Atltvhead-Gesture-Recognition-Bracer