AccPar

AccPar partitions the tensors of each layer across multiple accelerators.

To compile:

make

To print partitioning results:

./accpar ./networks/Alexnet.txt -1
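
The two steps can also be chained in a small script. A minimal sketch, assuming the repository's Makefile produces the ./accpar binary in the current directory, and reusing the AlexNet network description and final argument exactly as given above:

#!/bin/sh
# Build AccPar and print partitioning results for the bundled AlexNet description.
set -e                                # stop on the first failing command

make                                  # compile the partitioner into ./accpar
./accpar ./networks/Alexnet.txt -1    # print the computed tensor partitioning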

If you find this code useful in your research, please cite:

@inproceedings{song2020accpar,
  title={AccPar: Tensor Partitioning for Heterogeneous Deep Learning Accelerators},
  author={Song, Linghao and Chen, Fan and Zhuo, Youwei and Qian, Xuehai and Li, Hai and Chen, Yiran},
  booktitle={2020 IEEE International Symposium on High Performance Computer Architecture (HPCA)},
  pages={342--355},
  year={2020},
  organization={IEEE}
}

@inproceedings{song2019hypar,
  title={HyPar: Towards Hybrid Parallelism for Deep Learning Accelerator Array},
  author={Song, Linghao and Mao, Jiachen and Zhuo, Youwei and Qian, Xuehai and Li, Hai and Chen, Yiran},
  booktitle={2019 IEEE International Symposium on High Performance Computer Architecture (HPCA)},
  pages={56--68},
  year={2019},
  organization={IEEE}
}
