This file contains resources for different learning materials. I have compiled the best resources I could find for learning each topic, and you are welcome to use them; I would like to thank the creators of these resources for helping us out.
The topics covered range from basic deep learning to advanced models and their uses, and a large number of resources are included. Resources are compiled for the topics listed below:
- Machine Learning Techniques
- Deep Learning Techniques
- Computer Vision Models
- Natural Language Processing Models
- Basic Recurrent Neural Networks
- LSTM/GRU-Based Models
- Transformer-Based Models
- Others
- Stanford Machine Learning Course on Coursera: This is one of the best courses for beginners learning machine learning. One drawback is that the assignments are all written in MATLAB, so if you don't like that, you can just implement the same exercises in Python.
- StatQuest with Josh Starmer: This YouTube channel contains some great explanations of machine learning topics. It is a very good resource to follow.
- Edureka: The Edureka YouTube channel covers a vast number of machine learning topics. Although the videos are quite lengthy, the topics are explained quite well.
- Deep Learning Specialization: The Deep Learning Specialization is one of the best resources for learning deep learning techniques. The first three courses cover basic topics such as what a neural network is and how it works, while the last two are more advanced, covering Convolutional Neural Networks as well as advanced models like Transformers in a very approachable way.
Link: Deep Learning | Coursera
- Object Detection (YOLO): One of the most impactful tasks in computer vision is object detection, i.e. detecting objects in a given image. Different tutorials are available for learning YOLO. The most recent version of the YOLO algorithm is YOLOv5, implemented by Ultralytics; their repository includes a tutorial for using their YOLOv5 implementation (see the inference sketch after this item).
Link: Ultralytics YOLOv5
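A minimal sketch of running YOLOv5 inference through PyTorch Hub, following the usage shown in the Ultralytics repository; the image URL here is just an example input and can be any local path or URL.

```python
import torch

# Download the small pretrained YOLOv5 model from the Ultralytics repo.
model = torch.hub.load('ultralytics/yolov5', 'yolov5s', pretrained=True)

# Run inference on an example image (any path or URL works).
results = model('https://ultralytics.com/images/zidane.jpg')

results.print()         # summary of detected classes and confidences
print(results.xyxy[0])  # boxes as [x1, y1, x2, y2, confidence, class]
```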
- R-CNN Family: Before YOLO, Region-based CNNs (R-CNN) were used for object detection. Three different versions of R-CNN are available. Here are some blog posts to learn all of them in an easy way.
Link: Understanding Fast R-CNN and Faster R-CNN for Object Detection. | by Aakarsh Yelisetty | Towards Data Science
Link: R-CNN, Fast R-CNN, Faster R-CNN, YOLO — Object Detection Algorithms | by Rohith Gandhi | Towards Data Science
- Visualizing Neural Networks: There are different blogs and videos showing how a neural network decides the output of its hypothesis. One of the best videos I could find is the first 5:22 of a video by Luis Serrano.
Link: A friendly introduction to Recurrent Neural Networks - YouTube
- Recurrent Neural Network Theory: Recurrent Neural Networks are always difficult to understand. This video explains the recurrent unit in a very simple and efficient way (a small sketch of a single recurrent step is given after this item).
Link: A friendly introduction to Recurrent Neural Networks - YouTube
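A minimal sketch of one step of a vanilla recurrent unit, assuming the standard formulation h_t = tanh(W_xh x_t + W_hh h_prev + b); all sizes and the toy input sequence are made up for illustration.

```python
import numpy as np

input_size, hidden_size = 4, 3
rng = np.random.default_rng(0)

W_xh = rng.normal(size=(hidden_size, input_size))   # input-to-hidden weights
W_hh = rng.normal(size=(hidden_size, hidden_size))  # hidden-to-hidden weights
b = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One recurrent step: combine the current input with the previous state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b)

h = np.zeros(hidden_size)                       # initial hidden state
for x_t in rng.normal(size=(5, input_size)):    # a toy sequence of 5 timesteps
    h = rnn_step(x_t, h)
print(h)                                        # final hidden state
```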
- Seq2Seq Model with Attention: This blog explains the importance of the seq2seq model as well as how the seq2seq model works.
- Seq2Seq Model, Generation of Context: The encoder of a seq2seq model takes two inputs at each timestep: the current input word and the context, i.e. the hidden state from the previous unit. What happens at each timestep can be seen here (a minimal encoder sketch follows this item).
Link (Rolled Version): https://jalammar.github.io/images/seq2seq_5.mp4
Link (Unrolled Version): https://jalammar.github.io/images/seq2seq_6.mp4
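A minimal sketch of a seq2seq encoder step in PyTorch, assuming a GRU cell; it shows the two inputs at each timestep, the embedded input word and the hidden state (context) carried over from the previous unit. The vocabulary and all sizes are made up for illustration.

```python
import torch
import torch.nn as nn

vocab_size, embed_size, hidden_size = 100, 8, 16
embedding = nn.Embedding(vocab_size, embed_size)
gru_cell = nn.GRUCell(embed_size, hidden_size)

tokens = torch.tensor([5, 42, 7])        # a toy input sentence as word ids
h = torch.zeros(1, hidden_size)          # initial context

for token in tokens:
    x_t = embedding(token).unsqueeze(0)  # input 1: the current input word
    h = gru_cell(x_t, h)                 # input 2: previous hidden state

context = h  # the final hidden state is the context passed to the decoder
print(context.shape)  # torch.Size([1, 16])
```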
- Transformer Explained: The transformer is at the core of nearly all NLP models nowadays, making it one of the most important architectures to understand. Some great resources for understanding the transformer are given here:
Link:
- Transformer Code: A transformer code walkthrough is given in the following tutorials (a minimal sketch of scaled dot-product attention, the transformer's central operation, follows this list):
Link:
- gordicaleksa/pytorch-original-transformer: My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently included IWSLT pretrained models. (github.com)
- Pytorch Transformers from Scratch (Attention is all you need) - YouTube
- The Annotated Transformer (harvard.edu)
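A minimal sketch of scaled dot-product attention, the transformer's core operation: Attention(Q, K, V) = softmax(QKᵀ / √d_k)V. The shapes are made up for illustration, and multi-head projections are omitted.

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # query-key similarity
    weights = F.softmax(scores, dim=-1)                # attention distribution
    return weights @ v                                 # weighted sum of values

seq_len, d_k = 5, 16
q = torch.randn(seq_len, d_k)
k = torch.randn(seq_len, d_k)
v = torch.randn(seq_len, d_k)
print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([5, 16])
```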
- Positional Encoding: Positional encoding is one of the three key contributions of the original transformer paper, but there are not a lot of tutorials on it. The following blog provides a good illustration of positional encoding (a small sketch of the sinusoidal formula is given after this item):
Link: Transformer Architecture: The Positional Encoding - Amirhossein Kazemnejad's Blog
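A minimal sketch of the sinusoidal positional encoding from the original transformer paper, PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)); the sequence length and model dimension are made up for illustration.

```python
import numpy as np

def positional_encoding(max_len, d_model):
    pos = np.arange(max_len)[:, None]            # positions: (max_len, 1)
    i = np.arange(0, d_model, 2)[None, :]        # one frequency per dim pair
    angles = pos / np.power(10000, i / d_model)  # (max_len, d_model/2)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dimensions use sine
    pe[:, 1::2] = np.cos(angles)                 # odd dimensions use cosine
    return pe

pe = positional_encoding(max_len=50, d_model=16)
print(pe.shape)  # (50, 16)
```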
- BERT: BERT is one of the biggest models to follow the transformer and is used in many Natural Language Processing tasks, such as text classification. One of its key changes is that BERT takes the encoder of the transformer and modifies it; the full form of BERT is Bidirectional Encoder Representations from Transformers. Here are some tutorials for learning BERT (a minimal text-classification sketch follows this list):
Link:
- The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning) – Jay Alammar – Visualizing machine learning one concept at a time. (jalammar.github.io)
- A Visual Guide to Using BERT for the First Time – Jay Alammar – Visualizing machine learning one concept at a time. (jalammar.github.io)
- BERT — transformers 4.5.0.dev0 documentation (huggingface.co)
- notebooks/text_classification.ipynb at master · huggingface/notebooks (github.com)
- Community — transformers 4.5.0.dev0 documentation (huggingface.co)
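A minimal sketch of using BERT for text classification with the Hugging Face transformers library, assuming the standard 'bert-base-uncased' checkpoint; note the classification head is randomly initialized here, so it would still need fine-tuning on labeled data before the predictions mean anything.

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForSequenceClassification.from_pretrained(
    'bert-base-uncased', num_labels=2)  # 2 classes, e.g. positive/negative

inputs = tokenizer("This course was really helpful!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits     # shape: (1, num_labels)

print(logits.softmax(dim=-1))           # class probabilities (untrained head)
```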