The objectives of this project are to:
- Use a low-power solution to turn on a Wake-on-LAN (WoL) compatible computer. This is useful when the computer has to be turned on remotely or when its power button is faulty.
- Understand the advantages and limitations of implementing Artificial Intelligence (AI) on a low-power embedded system.
Details are included in `Project Report.pdf`.
- Arduino Nano 33 BLE Sense
- Raspberry Pi (preferably Pi 3 Model B or above)
- Computer with Wake-on-LAN functionality turned on and hybrid sleep turned off

See the *Communication between the devices* section in `Project Report.pdf` for more details.
- TTP223B Digital Capacitive Touch Sensor
- RGB LED

Note: Please remove the Touch Sensor and/or RGB LED code from `Codes/IMU_Classifier.ino` if unused.
In this submission folder, the following files are available:

- `Presentation Slides.pptx`: Brief overview of the project, including a recorded demonstration.
- `Project Report.pdf`: Detailed report covering the data collection process, inference performance, optimization techniques and more.
- `Codes/IMU_Capture.ino`: Code to capture the acceleration and gyroscope data from the on-board IMU.
- `Codes/IMU_Classifier.ino`: Code for classification based on the captured acceleration and gyroscope data. Includes code for the Touch Sensor and RGB LED to provide additional functionality beyond the built-in components.
- `Codes/Node-Red` folder: Flows used to control the Raspberry Pi based on the classified results from `Codes/IMU_Classifier.ino`.
- `Codes/Wake on LAN(WoL).ipynb`: Code used to train the machine learning model on the data captured with `Codes/IMU_Capture.ino`.
Note: Only Steps 1, 2, 8, 9, 10 and 11 are required if the gestures already implemented are good enough.
- Clone the repository or download the project code to your preferred disk location.
- Connect the Arduino Nano Sense board to the computer and install the required drivers.
- Open the `IMU_Capture.ino` file, flash the code and capture the data.
- Store all the captured data in a CSV file.
- Repeat steps 3 and 4 for each kind of gesture you want to capture.
- Open `Wake on LAN(WoL).ipynb`, modify the code to read from the CSV files you captured, and run the training.

Note: Training can take a long time, so consider running the code remotely through Google Colab or Kaggle with a GPU/TPU enabled.
- Save the models, and convert the selected model into a HEX file. Reference code is available in the `Wake on LAN(WoL).ipynb` file.
- Import the HEX file into the `IMU_Classifier.ino` file, modify the code to suit your gestures, and flash the code onto the Arduino Nano Sense board.
- Connect the Raspberry Pi to the Arduino Nano Sense, install `Node-RED` and `etherwake`, and import the flows in the `Codes/Node-Red` folder into the Raspberry Pi.
- Open the `wol.sh` file and change `XX:XX:XX:XX:XX:XX` to the targeted MAC address.

Note: Ensure the computer's LAN port allows for Wake-on-LAN functionality.
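The "store all the captured data in a CSV file" step can be done by hand from the Serial Monitor, or scripted. Below is a minimal, hypothetical sketch of turning captured serial lines into a clean CSV; the six column names follow the usual comma-separated output of `IMU_Capture.ino` and are an assumption, not part of this project's code.

```python
import csv
import io

# Assumed column order printed by IMU_Capture.ino (hypothetical).
HEADER = ["aX", "aY", "aZ", "gX", "gY", "gZ"]

def serial_lines_to_csv(lines, out):
    """Write valid 6-column IMU samples from `lines` into file-like `out`.

    Returns the number of samples kept. Blank separator lines and
    repeated header rows from the serial stream are skipped.
    """
    writer = csv.writer(out)
    writer.writerow(HEADER)
    kept = 0
    for line in lines:
        fields = line.strip().split(",")
        if len(fields) != len(HEADER):
            continue  # blank line between gesture captures
        try:
            writer.writerow([float(f) for f in fields])
            kept += 1
        except ValueError:
            continue  # a repeated "aX,aY,..." header row
    return kept

# Example input as it might appear on the Serial Monitor:
captured = [
    "aX,aY,aZ,gX,gY,gZ",
    "0.01,-0.98,0.12,1.5,-0.3,0.0",
    "",
    "0.02,-0.97,0.11,1.4,-0.2,0.1",
]
buf = io.StringIO()
print(serial_lines_to_csv(captured, buf))  # number of samples written
```

One CSV per gesture keeps the later training step simple: the notebook can then label each file by its gesture name.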
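The "convert the selected model into a HEX file" step amounts to embedding the trained `.tflite` model's bytes into the Arduino sketch as a C array (the format produced by `xxd -i`). The project's own reference code lives in `Wake on LAN(WoL).ipynb`; the sketch below is an illustrative stand-in, and the array name `model` is an assumption.

```python
def tflite_to_c_array(data: bytes, name: str = "model") -> str:
    """Render raw model bytes as a C unsigned-char array declaration,
    12 bytes per line, plus a length constant (xxd -i style)."""
    body = ",\n  ".join(
        ", ".join(f"0x{b:02x}" for b in data[i:i + 12])
        for i in range(0, len(data), 12)
    )
    return (
        f"const unsigned char {name}[] = {{\n  {body}\n}};\n"
        f"const unsigned int {name}_len = {len(data)};\n"
    )

# Demo with the first 8 bytes of a typical .tflite file header:
header = tflite_to_c_array(b"\x1c\x00\x00\x00TFL3")
print(header)
```

The generated text can be saved as a `.h` file and included from `IMU_Classifier.ino`.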
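For context on what `wol.sh` and `etherwake` do in the last step: waking the computer just means broadcasting a Wake-on-LAN "magic packet" — 6 bytes of `0xFF` followed by the target MAC address repeated 16 times — typically on UDP port 9. A minimal sketch (placeholder MAC, not part of the project's code):

```python
import socket

def magic_packet(mac: str) -> bytes:
    """Build the 102-byte WoL magic packet for a MAC like 'aa:bb:cc:dd:ee:ff'."""
    mac_bytes = bytes.fromhex(mac.replace(":", ""))
    assert len(mac_bytes) == 6, "MAC address must have 6 octets"
    # 6 synchronization bytes of 0xFF, then the MAC repeated 16 times.
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str) -> None:
    """Broadcast the magic packet on the local network (UDP port 9)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), ("255.255.255.255", 9))

print(len(magic_packet("aa:bb:cc:dd:ee:ff")))  # the packet is always 102 bytes
```

This is why hybrid sleep must be off and the LAN port must keep WoL enabled: the NIC has to stay powered just enough to listen for this packet.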