- This repo contains PyTorch implementations of Transformer variants (Transformer / Transformer-XL / R-Transformer). PRs are welcome. A minimal sketch of the shared attention mechanism is given below.
- If you are unfamiliar with the Transformer and its variants, refer to my blog post: transformer explanation.
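
As a quick illustration of the core building block these variants share, here is a minimal sketch of scaled dot-product attention in plain PyTorch. The function name and signature are illustrative only and do not reflect this repo's actual API.

```python
import math
import torch
import torch.nn.functional as F


def scaled_dot_product_attention(q, k, v, mask=None):
    """Minimal scaled dot-product attention sketch (illustrative, not this repo's API).

    q, k, v: tensors of shape (batch, heads, seq_len, d_k)
    mask:    optional tensor broadcastable to (batch, heads, seq_len, seq_len)
    """
    d_k = q.size(-1)
    # Attention scores: QK^T / sqrt(d_k)
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        # Positions where mask == 0 are blocked from attending
        scores = scores.masked_fill(mask == 0, float("-inf"))
    attn = F.softmax(scores, dim=-1)
    return torch.matmul(attn, v), attn


if __name__ == "__main__":
    q = k = v = torch.randn(2, 8, 10, 64)  # (batch, heads, seq_len, d_k)
    out, attn = scaled_dot_product_attention(q, k, v)
    print(out.shape, attn.shape)  # torch.Size([2, 8, 10, 64]) torch.Size([2, 8, 10, 10])
```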
For attribution in academic contexts, please cite this work as:
```bibtex
@misc{chai2019-transformer-in-pytorch,
  author       = {Chai, Yekun},
  title        = {Transformer-in-PyTorch},
  year         = {2019},
  publisher    = {GitHub},
  journal      = {GitHub repository},
  howpublished = {\url{https://github.com/cyk1337/Transformer-in-PyTorch}}
}
```
```bibtex
@misc{chai2019attn-summary,
  author       = {Chai, Yekun},
  title        = {{Attention in a Nutshell}},
  year         = {2019},
  howpublished = {\url{http://cyk1337.github.io/notes/2019/01/22/NLP/Attention-in-a-nutshell/}},
}
```
References: