MINT_Quantization

MINT: Multiplier-less INTeger Quantization for Energy Efficient Spiking Neural Networks, ASP-DAC 2024. Nominated for Best Paper Award.

TODO:

I will clean up the code soon.

Notice:

I found that the code throws errors under some PyTorch versions and will fix this later. For now, please run the code with PyTorch 1.13.0, the version that has been tested and confirmed working. Thanks.
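As a quick sanity check after installing the pinned version (e.g. via pip install torch==1.13.0), here is a minimal sketch that fails fast if the interpreter sees a different PyTorch build; it relies only on the standard torch.__version__ attribute, nothing repo-specific:

import torch

# The repo is tested against PyTorch 1.13.0; abort early on any other version.
# startswith() also accepts local build tags such as "1.13.0+cu117".
assert torch.__version__.startswith("1.13.0"), (
    f"MINT is tested on PyTorch 1.13.0, found {torch.__version__}"
)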

Citing

If you find MINT useful for your research, please use the following BibTeX entry to cite us:

@inproceedings{yin2024mint,
  title={MINT: Multiplier-less INTeger Quantization for Energy Efficient Spiking Neural Networks},
  author={Yin, Ruokai and Li, Yuhang and Moitra, Abhishek and Panda, Priyadarshini},
  booktitle={2024 29th Asia and South Pacific Design Automation Conference (ASP-DAC)},
  pages={830--835},
  year={2024},
  organization={IEEE}
}
