nn_from_scratch

A neural network built in numpy with ReLU/Sigmoid layers, trained with SGD/Adam optimization to solve bitwise AND (&).

Activation Functions: Leaky ReLU and Sigmoid are available; I ended up using Leaky ReLU.
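For reference, a minimal numpy sketch of what these two activations (and the derivatives backprop needs) typically look like; the negative-half slope `alpha` is an assumed hyperparameter, not taken from this repo:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: pass positives through, scale negatives by alpha."""
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    """Derivative of Leaky ReLU with respect to its input."""
    return np.where(x > 0, 1.0, alpha)

def sigmoid(x):
    """Sigmoid: squash inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    """Derivative of sigmoid, written in terms of the forward value."""
    s = sigmoid(x)
    return s * (1.0 - s)
```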

Neural Net: built with structure 2 -> 10 -> 1 (one hidden layer). Evaluates inputs in batches.
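A sketch of a batched forward pass through that 2 -> 10 -> 1 shape; the random initialization scheme here is assumed, and the repo's own code may differ:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

rng = np.random.default_rng(0)

# Hypothetical parameter shapes for a 2 -> 10 -> 1 network.
W1 = rng.normal(scale=0.5, size=(2, 10))   # input -> hidden
b1 = np.zeros(10)
W2 = rng.normal(scale=0.5, size=(10, 1))   # hidden -> output
b2 = np.zeros(1)

def forward(X):
    """Batched forward pass: X has shape (batch, 2), output (batch, 1)."""
    h = leaky_relu(X @ W1 + b1)  # hidden layer with Leaky ReLU
    return h @ W2 + b2           # linear output layer

# The four AND inputs evaluated as one batch.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
print(forward(X))  # untrained weights, so outputs are arbitrary here
```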

Optimizers: SGD and Adam. The Adam implementation differs from the paper's, but is mathematically equivalent. Large weights are also penalized.
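This is not the repo's exact code (its variant is rearranged, though mathematically equivalent), but a sketch of the textbook Adam update with an assumed L2-style penalty on large weights folded into the gradient:

```python
import numpy as np

class Adam:
    """Textbook Adam update with an L2 weight penalty added to the gradient.

    Hyperparameter defaults follow the Adam paper; `weight_decay` is an
    assumed name/value for the large-weight penalty, not taken from the repo.
    """
    def __init__(self, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8,
                 weight_decay=1e-4):
        self.lr, self.beta1, self.beta2 = lr, beta1, beta2
        self.eps, self.weight_decay = eps, weight_decay
        self.m = self.v = None
        self.t = 0

    def step(self, w, grad):
        """Return updated weights given current weights and their gradient."""
        if self.m is None:
            self.m = np.zeros_like(w)
            self.v = np.zeros_like(w)
        self.t += 1
        grad = grad + self.weight_decay * w           # penalize large weights
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad**2
        m_hat = self.m / (1 - self.beta1**self.t)     # bias correction
        v_hat = self.v / (1 - self.beta2**self.t)
        return w - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```

One optimizer instance would be kept per parameter array and `step` called once per training batch.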


Results: the trained network's outputs approximate the AND truth table:

0 AND 0 is -0.15923
0 AND 1 is -0.01149
1 AND 0 is -0.01174
1 AND 1 is 1.02833
