An implementation of a Multi-Layer Perceptron, with forward propagation, backpropagation using gradient descent, and training using batch or stochastic gradient descent.
Use: myNN = MyPyNN(layerSizes, alpha, regLambda). Here, layerSizes is a list of layer dimensions, from the input layer through the hidden layers to the output layer (as in the examples below), alpha = learning rate of gradient descent, and regLambda = regularization parameter.
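For orientation: the forward pass of such a network computes, at each layer, an activation of an affine transform of the previous layer's output. Below is a minimal sketch assuming sigmoid activations; the function and parameter names are illustrative, not MyPyNN's actual internals.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(weights, biases, x):
    # One forward pass: at each layer, apply an affine transform
    # followed by the sigmoid activation.
    a = np.asarray(x, dtype=float)
    for W, b in zip(weights, biases):
        a = sigmoid(W.dot(a) + b)
    return a
```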
from myPyNN import *
X = [0, 0.5, 1]
y = [0, 0.5, 1]
myNN = MyPyNN([1, 1, 1])
Input Layer : 1-dimensional (Eg: X)
1 Hidden Layer : 1-dimensional
Output Layer : 1-dimensional (Eg. y)
Learning Rate : 0.05 (default)
print(myNN.predict(0.2))
X = [[0,0], [1,1]]
y = [0, 1]
myNN = MyPyNN([2, 3, 1], 0.8)
Input Layer : 2-dimensional (Eg: X)
1 Hidden Layer : 3-dimensional
Output Layer : 1-dimensional (Eg. y)
Learning rate : 0.8
print(myNN.predict(X))
# Alternatively, batch gradient descent: myNN.trainUsingGD(X, y, 899)
myNN.trainUsingSGD(X, y, 1000)
print(myNN.predict(X))
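The difference between the two training calls is the granularity of the updates: trainUsingGD takes one step per pass over the whole training set (batch gradient descent), while trainUsingSGD takes one step per randomly chosen example (stochastic gradient descent). A rough sketch of the two loops on a toy one-parameter model (illustrative only, not MyPyNN's internals):

```python
import random

def gradient(w, X, y):
    # Mean squared-error gradient for a toy linear model y_hat = w * x.
    return sum(2 * (w * xi - yi) * xi for xi, yi in zip(X, y)) / len(X)

def train_gd(w, X, y, n_iters, alpha=0.05):
    # Batch GD: each step uses the gradient over the entire set.
    for _ in range(n_iters):
        w -= alpha * gradient(w, X, y)
    return w

def train_sgd(w, X, y, n_iters, alpha=0.05):
    # SGD: each step uses the gradient of one random example.
    for _ in range(n_iters):
        i = random.randrange(len(X))
        w -= alpha * gradient(w, [X[i]], [y[i]])
    return w
```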
X = [[2,2,2], [3,3,3], [4,4,4], [5,5,5], [6,6,6], [7,7,7], [8,8,8], [9,9,9], [10,10,10], [11,11,11]]
y = [.2, .3, .4, .5, .6, .7, .8, .9, 0, .1]
myNN = MyPyNN([3, 10, 10, 5, 1], 0.9, 0.5)
Input Layer : 3-dimensional (Eg: X)
3 Hidden Layers: 10-dimensional, 10-dimensional, 5-dimensional
Output Layer : 1-dimensional (Eg. y)
Learning rate : 0.9
Regularization parameter : 0.5
print(myNN.predict(X))
# Alternatively, batch gradient descent: myNN.trainUsingGD(X, y, 899)
myNN.trainUsingSGD(X, y, 1000)
print(myNN.predict(X))
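Presumably regLambda scales an L2 (weight-decay) penalty on the weights, the usual choice; with such a penalty, each gradient step also shrinks the weights toward zero. A sketch of one regularized update under that assumption (not necessarily MyPyNN's exact cost):

```python
import numpy as np

def l2_regularized_step(W, grad_W, alpha, regLambda, n_samples):
    # Assumes cost = loss + (regLambda / (2 * n_samples)) * sum(W ** 2),
    # so the penalty contributes (regLambda / n_samples) * W to the gradient.
    return W - alpha * (grad_W + (regLambda / n_samples) * W)

W = np.array([[0.5, -0.3], [0.1, 0.8]])
grad_W = np.zeros_like(W)  # pretend the loss gradient is zero
print(l2_regularized_step(W, grad_W, 0.9, 0.5, 10))  # weights decay toward zero
```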
I ran this on OS X, after installing Homebrew for command-line tools and pip for Python packages.
I designed the tutorial on Python 2.7, but it can be run on Python 3 as well. It requires the following Python packages:
- numpy
- matplotlib
- ipywidgets
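These can be installed with pip:
pip install numpy matplotlib ipywidgets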
The tutorial is an IPython notebook, designed and meant to be run in Jupyter. To install Jupyter, one can install Anaconda, which installs Python and Jupyter along with many other packages. Or, one can install only Jupyter using:
pip install jupyter
ipywidgets comes pre-installed with Jupyter. However, the widgets extension might need to be activated using:
jupyter nbextension enable --py widgetsnbextension
or, if Jupyter is installed inside a virtual environment, using:
jupyter nbextension enable --py --sys-prefix widgetsnbextension
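Once enabled, widgets can be used from a notebook cell. A minimal example (illustrative, not taken from the tutorial):

```python
from ipywidgets import interact

def show(alpha=0.05):
    # Called again whenever the slider moves.
    print("learning rate: %s" % alpha)

# Renders a float slider from 0.0 to 1.0 with step 0.05.
interact(show, alpha=(0.0, 1.0, 0.05))
```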
- Yann LeCun's backprop paper, containing tips for efficient backpropagation
- Mathematical notations for LaTeX, which can also be used in Jupyter
- Fernando Pérez, Brian E. Granger, "IPython: A System for Interactive Scientific Computing", Computing in Science and Engineering, vol. 9, no. 3, pp. 21-29, May/June 2007, doi:10.1109/MCSE.2007.53. URL: http://ipython.org