An open-source implementation of the Hessian-free optimization algorithm, based
on the "Deep Learning via Hessian-free Optimization" paper by James Martens
(ICML 2010, http://www.cs.toronto.edu/~jmartens/docs/Deep_HessianFree.pdf). The
implementation is in Theano (http://deeplearning.net/software/theano), a highly
efficient framework for defining and optimizing expressions with
multi-dimensional arrays, and re-uses a lot of code from the Deep Learning
Tutorials (http://www.deeplearning.net/tutorial).
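The central operation in Hessian-free (truncated-Newton) optimization is a
curvature-vector product computed without ever forming the curvature matrix
explicitly. As a rough illustration only (this is not code from this
repository, and the paper actually uses the Gauss-Newton matrix via the
R-operator rather than the plain Hessian), the standard grad-of-grad trick
looks like this in Theano; the toy cost and variable names are invented for
the example:

    import numpy
    import theano
    import theano.tensor as T

    # Toy quadratic cost in a shared parameter vector w (illustrative only).
    w = theano.shared(numpy.zeros(3), name='w')
    x = T.vector('x')                      # input vector
    v = T.vector('v')                      # direction of the curvature product
    cost = T.sum((T.dot(w, x) - 1.0) ** 2)

    # Hessian-vector product H*v via the grad-of-(grad . v) trick:
    # differentiate the scalar sum(grad(cost, w) * v) with respect to w.
    g = T.grad(cost, w)
    Hv = T.grad(T.sum(g * v), w)

    hess_vec = theano.function([x, v], Hv)
    print(hess_vec(numpy.ones(3), numpy.ones(3)))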
The prerequisites for using the Hessian-free code are:
- a working installation of Theano on the Python path
- a copy of the Deep Learning tutorials, which can be checked out from
https://github.com/lisa-lab/DeepLearningTutorials (simply add the code/
directory to PYTHONPATH)
- a one-time execution of the download.sh script (sh download.sh)
Once these prerequisites are satisfied, you can run the code as follows:
python run_autoencoder.py -p jacobi -b 500 -e 50 -d mnist -o hf
This will load the MNIST dataset and perform 50 iterations of layer-wise RBM
pre-training, followed by Hessian-free optimization of the global
reconstruction cost using mini-batches of size 500 and an (approximate) Jacobi
pre-conditioner. Supervised learning then follows.
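A Jacobi pre-conditioner rescales the conjugate-gradient inner solve by (an
approximation of) the diagonal of the curvature matrix. The following is a
minimal, self-contained sketch of diagonally preconditioned conjugate gradient
in plain NumPy; the function name solve_pcg and the explicit matrix A are
invented for illustration and do not reflect this repository's interfaces
(which work with matrix-free curvature-vector products instead):

    import numpy

    def solve_pcg(A, b, diag_precond, max_iters=100, tol=1e-8):
        """Preconditioned CG for A x = b with a diagonal (Jacobi) preconditioner.

        A            -- symmetric positive-definite matrix (anything with .dot);
                        in Hessian-free optimization this role is played by a
                        curvature-vector product.
        diag_precond -- positive vector approximating diag(A).
        """
        x = numpy.zeros_like(b)
        r = b - A.dot(x)                   # residual
        z = r / diag_precond               # preconditioned residual
        p = z.copy()
        rz = r.dot(z)
        for _ in range(max_iters):
            Ap = A.dot(p)
            alpha = rz / p.dot(Ap)
            x += alpha * p
            r -= alpha * Ap
            if numpy.linalg.norm(r) < tol:
                break
            z = r / diag_precond
            rz_new = r.dot(z)
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # Toy usage: a small SPD system, preconditioned by its own diagonal.
    A = numpy.array([[4.0, 1.0], [1.0, 3.0]])
    b = numpy.array([1.0, 2.0])
    print(solve_pcg(A, b, numpy.diag(A)))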
The code is companion to the following paper:
Olivier Chapelle and Dumitru Erhan. Improved preconditioner for Hessian free
optimization. In NIPS Workshop on Deep Learning and Unsupervised Feature
Learning, 2011. http://olivier.chapelle.cc/pub/precond.pdf
For questions and/or comments, contact Olivier Chapelle and Dumitru Erhan
(usernames chap and dumitru on the yahoo-inc.com domain).