Implementing Neural Networks using Maths and Numpy only

TheHarshal30/Curiosity

Curiosity

Implementing Neural Networks from scratch

What are Neural Networks?

“Artificial” neural networks are inspired by the organic brain, translated to the computer. It’s not a perfect comparison, but there are neurons, activations, and lots of interconnectivity, even if the underlying processes are quite different.

A single neuron by itself is relatively useless, but when combined with hundreds or thousands (or many more) of other neurons, the interconnectivity produces relationships and results that frequently outperform many other machine learning methods.
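To make the idea concrete, here is a minimal NumPy sketch of a dense layer, where each neuron computes a weighted sum of its inputs plus a bias. The class and parameter names here are illustrative, not the repo's exact API:

```python
import numpy as np

class DenseLayer:
    """A fully connected layer: every neuron sees every input."""

    def __init__(self, n_inputs, n_neurons):
        # Small random weights, zero biases (a common initialization sketch)
        self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        # Batched forward pass: (batch, n_inputs) @ (n_inputs, n_neurons)
        return inputs @ self.weights + self.biases

layer = DenseLayer(n_inputs=4, n_neurons=3)
out = layer.forward(np.random.randn(8, 4))
print(out.shape)  # (8, 3)
```

Stacking several such layers, with nonlinear activations between them, is what produces the interconnectivity described above.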

Why implement from scratch?

Even though libraries such as PyTorch and TensorFlow are available, using them alone does not reveal how neural networks actually work at the core level.
Understanding the internals makes it easier to fine-tune hyperparameters and to diagnose errors.

Topics covered:

  • Linear Activation
  • ReLU Activation
  • Sigmoid Activation
  • Softmax Activation
  • Binary Cross Entropy Loss
  • Categorical Cross Entropy Loss
  • Mean Absolute Error Loss
  • Mean Squared Error Loss
  • Stochastic Gradient Descent Optimizer (SGD)
  • Adagrad Optimizer
  • Adam Optimizer
  • Root Mean Squared Propagation Optimizer (RMSprop)
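A few of the topics above can be sketched directly in NumPy. These are illustrative one-liner versions, not the repo's class interfaces:

```python
import numpy as np

def relu(x):
    # ReLU: pass positives through, zero out negatives
    return np.maximum(0, x)

def sigmoid(x):
    # Squashes any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Subtract the row max first for numerical stability
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def categorical_cross_entropy(probs, targets):
    # targets: integer class labels; clip to avoid log(0)
    p = np.clip(probs[np.arange(len(probs)), targets], 1e-7, 1 - 1e-7)
    return -np.mean(np.log(p))

logits = np.array([[2.0, 1.0, 0.1]])
probs = softmax(logits)            # rows sum to 1
ce = categorical_cross_entropy(probs, np.array([0]))
```

Each softmax row is a probability distribution over classes, which is why it pairs naturally with categorical cross entropy.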

Final Model

The complete model is tested on the Fashion MNIST dataset.
After some time spent searching for the best hyperparameters, an accuracy of 90% was achieved.
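The pieces above combine into a training step roughly like the following sketch. It uses random stand-in data rather than the actual Fashion MNIST pipeline, and the classic softmax-plus-cross-entropy gradient with a plain SGD update:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((32, 784))   # stand-in for flattened 28x28 images
y = rng.integers(0, 10, size=32)     # stand-in class labels

W = 0.01 * rng.standard_normal((784, 10))
b = np.zeros(10)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for step in range(5):
    # Forward: linear layer -> softmax -> cross-entropy loss
    probs = softmax(X @ W + b)
    loss = -np.mean(np.log(probs[np.arange(len(y)), y] + 1e-7))
    # Gradient of cross entropy w.r.t. logits is (probs - one_hot) / N
    d_logits = probs.copy()
    d_logits[np.arange(len(y)), y] -= 1
    d_logits /= len(y)
    # Backprop into the linear layer, then a vanilla SGD step
    W -= 0.1 * (X.T @ d_logits)
    b -= 0.1 * d_logits.sum(axis=0)
```

Swapping the SGD step for Adagrad, RMSprop, or Adam changes only how the gradient is applied, not how it is computed.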

Support

⭐ Please Star and share the repository. Thanks! ❤️
