This project implements a Transformer-based model that predicts a monkey's cursor velocity from its neural spike activity, using long-term recordings of motor and premotor cortical spiking during reaching. The dataset consists of one session of neural recordings from 71 channels sampled at ~100 Hz, together with the corresponding cursor velocity data. The goal is to model the relationship between neural activity and movement dynamics using a decoder-only Transformer implemented from scratch.
- data_loader.py – Loads neural data from a remote source.
- preprocessing.py – Processes spike times and aligns them with indices.
- model.py – Defines the neural network architecture.
- train.py – Trains the neural decoder model and evaluates performance.
- utils.py – Contains helper functions for data normalization and conversion.
- Install Requirements
pip install -r requirements.txt
- Load Data
python data_loader.py
- Preprocess Data
python preprocessing.py
- Train Model
python train.py
- Test Model
python test.py
- Neural Data:
  - spike_times: (n_samples,)
  - spike_times_index: (n_channels,) (unpacked per channel in the sketch after this list)
- Cursor Velocity Data:
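The spike_times / spike_times_index pair suggests a ragged, NWB-style layout, where spike_times is one flat array of all spike times and spike_times_index stores each channel's cumulative end offset into it. The helper below is a minimal sketch of how channels can be unpacked under that assumption; the name split_spikes_by_channel is illustrative and not part of the repository.

```python
import numpy as np

# Minimal sketch, assuming an NWB-style ragged layout:
# spike_times holds all spikes back to back, and spike_times_index[c]
# is the end offset of channel c's spikes within that flat array.
def split_spikes_by_channel(spike_times, spike_times_index):
    """Return one array of spike times per channel."""
    starts = np.concatenate(([0], spike_times_index[:-1]))
    return [spike_times[s:e] for s, e in zip(starts, spike_times_index)]

# Toy example: channel 0 has 2 spikes, channel 1 has 1, channel 2 has 3.
spikes = np.array([0.01, 0.05, 0.02, 0.30, 0.31, 0.90])
index = np.array([2, 3, 6])
per_channel = split_spikes_by_channel(spikes, index)  # arrays of lengths 2, 1, 3
```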
- Binning Neural Data:
  - Spikes are binned into 100 ms windows, each containing 10 bins of 10 ms (see the sketch after this list).
- Removing Abnormal Velocities:
  - Data points where |Vx| > 40 or |Vy| > 40 are filtered out.
- Tokenization and Temporal Embedding:
  - Neural spikes are tokenized into numerical sequences.
  - Temporal embeddings are applied before passing the data into the Transformer.
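The sketch below illustrates the binning and velocity-filtering steps, assuming spike times in seconds and cursor velocity sampled on the same 10 ms grid. The names bin_spikes and filter_velocity are illustrative; the actual logic lives in preprocessing.py.

```python
import numpy as np

BIN_MS = 10        # 10 ms bins
WINDOW_BINS = 10   # 10 bins form one 100 ms window
VEL_LIMIT = 40.0   # |Vx| / |Vy| threshold for outlier removal

def bin_spikes(per_channel_spikes, duration_s, bin_ms=BIN_MS):
    """Count spikes per channel in fixed 10 ms bins -> (channels, bins)."""
    n_bins = int(np.ceil(duration_s * 1000 / bin_ms))
    edges = np.arange(n_bins + 1) * (bin_ms / 1000.0)
    return np.stack([np.histogram(st, bins=edges)[0] for st in per_channel_spikes])

def filter_velocity(binned, vel, limit=VEL_LIMIT):
    """Drop time steps where either velocity component exceeds the threshold.

    binned: (channels, bins) spike counts; vel: (bins, 2) with columns (Vx, Vy).
    """
    keep = (np.abs(vel[:, 0]) <= limit) & (np.abs(vel[:, 1]) <= limit)
    return binned[:, keep], vel[keep]
```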
The model is a decoder-only Transformer, implemented from scratch. It consists of:
- Multi-Head Self-Attention (with learnable heads)
- Feedforward Expansion Layer (scales feature dimensions)
- Positional Encoding
- Stacked Transformer Blocks
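The following sketch illustrates such an architecture in PyTorch. It is an assumption of what model.py might look like, with placeholder hyperparameters (d_model, number of heads and layers), not the repository's exact implementation.

```python
import torch
import torch.nn as nn

class DecoderBlock(nn.Module):
    """One Transformer block: multi-head self-attention + feedforward expansion."""
    def __init__(self, d_model=128, n_heads=4, ff_mult=4, dropout=0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, ff_mult * d_model),  # feedforward expansion
            nn.ReLU(),
            nn.Linear(ff_mult * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x, mask):
        a, _ = self.attn(x, x, x, attn_mask=mask, need_weights=False)
        x = self.norm1(x + a)
        return self.norm2(x + self.ff(x))

class NeuralDecoder(nn.Module):
    """Stacked decoder blocks with learned positional encoding; outputs (Vx, Vy)."""
    def __init__(self, n_channels=71, d_model=128, n_layers=4, max_len=512):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)  # token embedding of binned spikes
        self.pos = nn.Embedding(max_len, d_model)    # positional / temporal encoding
        self.blocks = nn.ModuleList([DecoderBlock(d_model) for _ in range(n_layers)])
        self.head = nn.Linear(d_model, 2)            # cursor velocity (Vx, Vy)

    def forward(self, x):  # x: (batch, time, channels) of binned spike counts
        L = x.size(1)
        # Causal mask so each time step attends only to past and present bins.
        mask = torch.triu(torch.ones(L, L, dtype=torch.bool, device=x.device), diagonal=1)
        h = self.embed(x) + self.pos(torch.arange(L, device=x.device))
        for block in self.blocks:
            h = block(h, mask)
        return self.head(h)  # per-step (Vx, Vy) prediction
```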
Test MSE: 13.135366, R2 Score: 0.689168
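For reference, a sketch of how these two metrics are conventionally computed over the predicted and true (Vx, Vy) traces; train.py may instead use the scikit-learn equivalents (mean_squared_error, r2_score).

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error over all samples and both velocity components."""
    return float(np.mean((y_true - y_pred) ** 2))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean(axis=0)) ** 2)
    return float(1.0 - ss_res / ss_tot)
```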
🔴 Current Issue: Multi-Session Training and Testing
✅ Possible Improvement: Improving decoding accuracy
- Matin M.Babaei – Intern
- Arshia Afzal – Supervisor
EPFL License