🎒 🔧 Attention is all you need!

"A transformer is a deep learning model that adopts the mechanism of self-attention, differentially weighting the significance of each part of the input data." ~ Wikipedia

‘What I cannot create, I do not understand.’ ~ Richard Feynman (1918 – 1988) 📺 🔗

Table of contents

  1. Tutorials
  2. References

Clone repository

The GitHub repository is available at https://github.com/mxochicale/transformers-tutorials

To clone this repo, you may need to generate SSH keys as suggested here. You can then clone the repository by typing (or copying) the following lines in a terminal at your chosen path on your machine:

cd && mkdir -p repositories/mxochicale && cd repositories/mxochicale
git clone git@github.com:mxochicale/transformers-tutorials.git
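The SSH key setup mentioned above can be sketched as follows. This is a minimal example using OpenSSH's `ssh-keygen`; the email address is a placeholder, and the default key path `~/.ssh/id_ed25519` is assumed:

```shell
# Generate an Ed25519 key pair (accept the default path when prompted).
# The email here is only a label for the key, not tied to any account.
ssh-keygen -t ed25519 -C "your_email@example.com"

# Print the public key, then add it to your GitHub account under
# Settings > SSH and GPG keys > New SSH key.
cat ~/.ssh/id_ed25519.pub
```

Once the public key is registered, the `git clone git@...` command above should authenticate without a password prompt.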

Issues

If you have questions or experience any problems, please open an issue.