Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

The course can be found on Coursera.

Quizzes and answers are collected for quick reference in my blog, SSQ.

  • Week 1 Practical aspects of Deep Learning
    • Recall that different types of initializations lead to different results
    • Recognize the importance of initialization in complex neural networks.
    • Recognize the difference between train/dev/test sets
    • Diagnose the bias and variance issues in your model
    • Learn when and how to use regularization methods such as dropout or L2 regularization
    • Understand practical issues in deep learning such as vanishing or exploding gradients, and learn how to deal with them
    • Use gradient checking to verify the correctness of your backpropagation implementation
    • Initialization
    • Regularization
    • Gradient Checking
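As an illustration of the gradient-checking topic above, here is a minimal sketch (not the course's assignment code) that compares an analytic gradient against centered finite differences for a small, hand-picked cost function; the function names and the toy cost J(theta) = theta[0]^2 + 3*theta[1] are assumptions for the example.

```python
import numpy as np

def cost(theta):
    # Toy cost chosen for the example: J(theta) = theta[0]^2 + 3*theta[1]
    return theta[0] ** 2 + 3 * theta[1]

def analytic_grad(theta):
    # Hand-derived gradient of the toy cost: [2*theta[0], 3]
    return np.array([2 * theta[0], 3.0])

def grad_check(theta, eps=1e-7):
    """Return the relative difference between the analytic gradient
    and a centered finite-difference approximation."""
    approx = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += eps
        minus[i] -= eps
        approx[i] = (cost(plus) - cost(minus)) / (2 * eps)
    grad = analytic_grad(theta)
    return np.linalg.norm(grad - approx) / (
        np.linalg.norm(grad) + np.linalg.norm(approx)
    )

diff = grad_check(np.array([1.5, -2.0]))
```

A relative difference on the order of 1e-7 or smaller is the usual sign that a backpropagation implementation is correct; values near 1e-3 or larger suggest a bug.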
  • Week 2 Optimization algorithms
    • Remember different optimization methods such as (Stochastic) Gradient Descent, Momentum, RMSProp and Adam
    • Use random minibatches to accelerate the convergence and improve the optimization
    • Know the benefits of learning rate decay and apply it to your optimization
    • Optimization
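The Adam update covered in this week combines Momentum's first-moment estimate with RMSProp's second-moment estimate, plus bias correction. A minimal single-step sketch (parameter names lr, beta1, beta2, eps follow the usual convention; the quadratic toy objective is an assumption for the example):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameters w given gradient grad at step t (t >= 1)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (Momentum-style)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (RMSProp-style)
    m_hat = m / (1 - beta1 ** t)              # bias correction for m
    v_hat = v / (1 - beta2 ** t)              # bias correction for v
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy usage: minimize J(w) = w[0]^2 + w[1]^2 for a few steps.
w = np.array([1.0, -1.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 4):
    grad = 2 * w
    w, m, v = adam_step(w, grad, m, v, t)
```

Note how bias correction matters early on: at t = 1, dividing by (1 - beta1) rescales m so the first update is not artificially tiny.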
  • Week 3 Hyperparameter tuning, Batch Normalization and Programming Frameworks
    • Master the process of hyperparameter tuning
    • Master the process of batch normalization
    • TensorFlow
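To illustrate the batch-normalization topic, here is a minimal NumPy sketch of the forward pass for one layer's pre-activations (shape: features x examples); gamma and beta are the learnable scale and shift parameters, and the variable names are assumptions for the example rather than the course's assignment code.

```python
import numpy as np

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    """Normalize each feature row of Z across the batch, then scale and shift."""
    mu = Z.mean(axis=1, keepdims=True)
    var = Z.var(axis=1, keepdims=True)
    Z_norm = (Z - mu) / np.sqrt(var + eps)  # zero mean, unit variance per feature
    return gamma * Z_norm + beta            # gamma/beta restore representational power

# Toy usage: two features with very different scales end up normalized alike.
Z = np.array([[1.0, 2.0, 3.0],
              [10.0, 20.0, 30.0]])
gamma = np.ones((2, 1))
beta = np.zeros((2, 1))
out = batchnorm_forward(Z, gamma, beta)
```

With gamma = 1 and beta = 0 each output row has (approximately) zero mean and unit variance; during training, gamma and beta are learned alongside the weights.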
