Sbalbp/MNIST_basic_NN


MNIST_basic_NN

CircleCI

Basic implementation of a Neural Network for training on the popular MNIST dataset.

Two possible approaches are available:

  1. Simple Network: A simple 3-layered neural network trained over the full training dataset using backpropagation and gradient descent.

  2. Bagging ensemble: Several 3-layered neural networks, each trained on a different random subset (60% of the full size) of the training dataset, whose results are aggregated to produce the final prediction, in order to reduce the overfitting incurred by the single-model approach.
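Both approaches can be sketched in NumPy on toy data. This is a minimal illustration, not the repository's actual code: the hidden-layer size, weight initialization, squared-error loss, synthetic dataset, and score-averaging aggregation are all assumptions (the script may, for instance, use majority voting instead).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ThreeLayerNet:
    """Input -> hidden -> output network, trained with full-batch gradient
    descent on a squared-error loss (sketch; hyperparameters are assumed)."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.2, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)   # hidden activations
        return sigmoid(self.h @ self.W2 + self.b2)

    def train(self, X, Y, epochs=1000):
        n = len(X)
        for _ in range(epochs):
            out = self.forward(X)
            # Backpropagation: error signal at output, then at hidden layer.
            d_out = (out - Y) * out * (1.0 - out)
            d_h = (d_out @ self.W2.T) * self.h * (1.0 - self.h)
            self.W2 -= self.lr * (self.h.T @ d_out) / n
            self.b2 -= self.lr * d_out.mean(axis=0)
            self.W1 -= self.lr * (X.T @ d_h) / n
            self.b1 -= self.lr * d_h.mean(axis=0)

    def predict(self, X):
        return self.forward(X).argmax(axis=1)

def bagging_predict(models, X):
    """Aggregate the ensemble by averaging class scores, then take argmax."""
    scores = np.mean([m.forward(X) for m in models], axis=0)
    return scores.argmax(axis=1)

# Toy demo: two well-separated 2-D Gaussian clusters instead of MNIST.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 0.3, (50, 2)), rng.normal(1.0, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
Y = np.eye(2)[y]          # one-hot targets

single = ThreeLayerNet(2, 8, 2)
single.train(X, Y, epochs=1000)

# Bagging: each model sees a random 60%-sized subset of the training data.
models = []
for i in range(5):
    idx = rng.choice(len(X), size=int(0.6 * len(X)), replace=False)
    m = ThreeLayerNet(2, 8, 2, seed=i + 1)
    m.train(X[idx], Y[idx], epochs=1000)
    models.append(m)
```

On MNIST the same structure applies with 784 inputs, 10 outputs, and the full 60,000-image training set.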

Usage

First, download the MNIST dataset and unzip it into a local directory.
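The dataset ships as IDX-format files (big-endian header, then raw bytes). A minimal reader, assuming the standard IDX layout — this is a sketch, not the loader the script actually uses — could look like:

```python
import struct
import numpy as np

def read_idx(path):
    """Parse an IDX file: two zero bytes, a type code, a dimension count,
    then one big-endian uint32 per dimension, then the raw data."""
    with open(path, "rb") as f:
        zeros, dtype_code, ndim = struct.unpack(">HBB", f.read(4))
        dims = struct.unpack(">" + "I" * ndim, f.read(4 * ndim))
        # 0x08 = unsigned byte, the type used by all four MNIST files.
        data = np.frombuffer(f.read(), dtype=np.uint8)
        return data.reshape(dims)
```

For example, reading the training-image file yields an array of shape (60000, 28, 28), and the label file a vector of length 60000.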

After this, you can just run the script:

python mnist_nn.py --nn

The following arguments are accepted by the script:

  • --help Display help.
  • --nn Indicates that training with a simple 3-layer Neural Network should be performed.
  • --bag Indicates that training using bagging with several models should be performed.
  • -d dir Indicates the directory where the dataset .idx files are located. Defaults to the current directory.
  • -lr lr Sets the learning rate of the neural network. Default value is 0.2.
  • -n n Indicates how many models should be built when using bagging. Default value is 15.
  • -e e Indicates for how many epochs each neural network should be trained (divided by 10 when bagging). Default value is 1000.
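The flag handling described above might be reproduced with argparse roughly as follows. This is a hypothetical reconstruction based only on the list of accepted arguments; the script's actual parsing code may differ.

```python
import argparse

def build_parser():
    # Hypothetical parser mirroring the documented flags and defaults.
    p = argparse.ArgumentParser(description="Train a basic NN on MNIST")
    p.add_argument("--nn", action="store_true",
                   help="train a single 3-layer neural network")
    p.add_argument("--bag", action="store_true",
                   help="train a bagging ensemble of networks")
    p.add_argument("-d", default=".", metavar="dir",
                   help="directory containing the dataset .idx files")
    p.add_argument("-lr", type=float, default=0.2, metavar="lr",
                   help="learning rate")
    p.add_argument("-n", type=int, default=15, metavar="n",
                   help="number of models when bagging")
    p.add_argument("-e", type=int, default=1000, metavar="e",
                   help="training epochs per network")
    return p
```

For instance, `python mnist_nn.py --bag -lr 0.1 -n 5` would train five bagged models with a learning rate of 0.1.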

License

MIT License
