aopy/mstd
MSTD* is a computational neuroscience project that models motion and depth processing in the primate visual cortex.

It uses spiking neural networks (SNNs) built from leaky integrate-and-fire (LIF) and adaptive exponential integrate-and-fire (AdEx) neuron models.
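As a rough illustration (not the repository's actual implementation), a leaky integrate-and-fire neuron can be simulated with simple Euler integration; the membrane time constant, threshold, and input values below are arbitrary placeholders:

```python
def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0, tau_m=10.0, dt=1.0):
    """Euler integration of a leaky integrate-and-fire neuron.

    input_current: sequence of input values, one per time step.
    Returns the list of time-step indices at which the neuron spiked.
    """
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane dynamics: dv/dt = (-(v - v_rest) + I) / tau_m
        # (input resistance folded into the input current)
        v += dt * (-(v - v_rest) + i_in) / tau_m
        if v >= v_thresh:
            spikes.append(t)
            v = v_rest  # reset to rest after a spike
    return spikes
```

With a constant suprathreshold input the neuron fires regularly; with zero input it stays silent. The AdEx model extends this with an exponential spike-initiation term and an adaptation variable.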

Stimuli: The project includes artificial stimuli (bars moving up, down, left, and right) in the "ds_models" directory, and event-camera recordings from the TUM-VIE and MVSEC datasets in the "of_models" and "v_models" directories.
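A minimal sketch of such an artificial moving-bar stimulus, assuming a binary grid with one bar position per frame (the repository's actual stimulus generation may differ in resolution, bar width, and encoding):

```python
def moving_bar_frames(size=8, direction="right"):
    """Generate binary frames of a one-pixel-wide bar sweeping across
    a size x size grid. Returns one frame per bar position;
    1 = bar pixel, 0 = background."""
    frames = []
    for step in range(size):
        frame = [[0] * size for _ in range(size)]
        for r in range(size):
            for c in range(size):
                if direction == "right" and c == step:
                    frame[r][c] = 1
                elif direction == "left" and c == size - 1 - step:
                    frame[r][c] = 1
                elif direction == "down" and r == step:
                    frame[r][c] = 1
                elif direction == "up" and r == size - 1 - step:
                    frame[r][c] = 1
        frames.append(frame)
    return frames
```

Event-camera recordings differ from these frames in that they provide asynchronous per-pixel brightness-change events rather than dense images.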

Learning: The project employs Spike-Timing-Dependent Plasticity (STDP) and backpropagation to achieve selectivity for motion properties.
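A pair-based STDP rule can be sketched as follows; the learning rates and exponential time constants here are illustrative placeholders, not the project's settings:

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP: potentiate the synapse when the presynaptic
    spike precedes the postsynaptic spike, depress it otherwise.
    Spike times are in milliseconds; the weight is clipped to
    [w_min, w_max]."""
    dt = t_post - t_pre
    if dt >= 0:
        # Pre before post: long-term potentiation
        w += a_plus * math.exp(-dt / tau_plus)
    else:
        # Post before pre: long-term depression
        w -= a_minus * math.exp(dt / tau_minus)
    return min(max(w, w_min), w_max)
```

Repeated over many spike pairs, this rule strengthens synapses whose inputs reliably predict the neuron's firing, which is how direction selectivity can emerge without labels; backpropagation, by contrast, uses supervised gradients.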

Software: The models are implemented in PyTorch together with Norse, a PyTorch-based library that provides tools for constructing and simulating spiking neural networks.

Hardware: The models run on either CPU or GPU, using CUDA when available to speed up simulation.
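In PyTorch, the CPU/GPU fallback typically looks like the following sketch (the repository may select its device differently):

```python
import torch

def pick_device():
    """Prefer a CUDA GPU when one is available; otherwise fall back
    to the CPU so the same code runs on both kinds of hardware."""
    return torch.device("cuda" if torch.cuda.is_available() else "cpu")
```

Models and tensors are then moved to the chosen device with `.to(device)` before simulation.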

*MSTD stands for Medial Superior Temporal Dorsal (MSTd), a motion-processing area of the primate visual cortex.

About

SNN models for motion processing
