This repository contains code for the core concepts required to understand deep learning from scratch. The code is written in Python, with NumPy used for numerical computation, and is kept simple so it can serve as a reference for understanding the concepts.
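To give a feel for the from-scratch NumPy style used throughout the folders, here is a minimal illustrative sketch (not taken from any particular folder; the names `sigmoid` and `dense_forward` are just examples) of a single fully connected layer with a sigmoid activation:

```python
import numpy as np

def sigmoid(z):
    # element-wise logistic function
    return 1.0 / (1.0 + np.exp(-z))

def dense_forward(x, W, b):
    # x: (batch, in_features), W: (in_features, out_features), b: (out_features,)
    return sigmoid(x @ W + b)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # a batch of 4 examples with 3 features
W = rng.normal(size=(3, 2)) * 0.1    # small random weights
b = np.zeros(2)
print(dense_forward(x, W, b).shape)  # (4, 2)
```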
1-Probabability-Statistics/
- Contains code for probability concepts required for understanding deep learning.
2-Linear Algebra/
- Contains code for linear algebra concepts required for understanding deep learning.
3-Calculus/
- Contains code for calculus concepts required for understanding deep learning.
4-Neural-Networks/
- Contains code for neural network concepts.
5-Convolutional Neural Networks/
- Contains code for convolutional neural network concepts.
6-Recurrent Neural Networks/
- Contains code for recurrent neural networks.
7-Generative Adversarial Networks/
- Contains code for generative adversarial networks.
8-Autoencoders/
- Contains code for autoencoder concepts.
9-Optimization/
- Contains code for optimization concepts.
10-Regularization/
- Contains code for regularization concepts.
11-Loss Functions/
- Contains code for loss function concepts (see the sketch after this list).
12-Activation Functions/
- Contains code for activation function concepts (see the sketch after this list).
13-Model Evaluation/
- Contains code for model evaluation concepts.
14-Hyperparameter Tuning/
- Contains code for hyperparameter tuning concepts.
15-Prob. ML by Kevin Murphy/
- Contains code and notes from the book Probabilistic Machine Learning: An Introduction (Part 1 of Kevin Murphy's series).
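As a taste of the material in the loss function and activation function folders, here is a minimal NumPy sketch (an illustrative assumption, not copied from the repository) of the softmax activation paired with the cross-entropy loss:

```python
import numpy as np

def softmax(z):
    # subtract the row-wise max for numerical stability before exponentiating
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels, eps=1e-12):
    # probs: (batch, classes) softmax outputs; labels: (batch,) integer class ids
    n = probs.shape[0]
    return -np.mean(np.log(probs[np.arange(n), labels] + eps))

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 0.2, 3.0]])
labels = np.array([0, 2])
print(cross_entropy(softmax(logits), labels))  # small loss: predictions match labels
```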
Resources:
- Dive into Deep Learning
- Deep Learning Specialization
- Deep Learning Book
- Minimum Viable Study Plan for Machine Learning Interviews
- Deep Learning Interviews
- CS 601.471/671 NLP: Self-supervised Models - Johns Hopkins University - Spring 2024
- Probabilistic Machine Learning: An Introduction by Kevin Murphy