
Snapshot Ensembles: Train 1, Get M for Free

This repository contains the Torch code for the paper Snapshot Ensembles: Train 1, Get M for Free.

The code is based on fb.resnet.torch by Facebook.

There is also a nice Keras implementation by titu1994.

Table of Contents

  1. Introduction
  2. Usage
  3. Contact

Introduction

Snapshot Ensembling is a method for obtaining an ensemble of multiple neural networks at no additional training cost. This is achieved by letting a single neural network converge into several local minima along its optimization path and saving the model parameters at each of them. The repeated rapid convergence is realized using multiple learning rate annealing cycles.

Figure 1: Left: Illustration of SGD optimization with a typical learning rate schedule. The model converges to a minimum at the end of training. Right: Illustration of Snapshot Ensembling optimization. The model undergoes several learning rate annealing cycles, converging to and escaping from multiple local minima. We take a snapshot at each minimum for test-time ensembling.
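Concretely, the learning rate within each cycle follows the shifted cosine schedule from the paper: it starts at the initial value, anneals toward zero over ⌈T/M⌉ iterations, then resets for the next cycle. A minimal Lua sketch of this schedule (the function name and arguments are illustrative, not code from this repository):

```lua
-- Shifted cosine annealing:
--   alpha(t) = alpha0/2 * (cos(pi * mod(t-1, ceil(T/M)) / ceil(T/M)) + 1)
-- t: current iteration (1..T), T: total iterations, M: number of cycles.
local function snapshotLR(t, T, M, alpha0)
   local cycleLen = math.ceil(T / M)
   local tCur = (t - 1) % cycleLen
   return alpha0 / 2 * (math.cos(math.pi * tCur / cycleLen) + 1)
end
```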

Usage

  1. Install Torch ResNet (https://github.com/facebook/fb.resnet.torch);
  2. Copy the files from this repository into the fb.resnet.torch/ directory, replacing the original train.lua;
  3. An example command to train a Snapshot Ensemble with ResNet-110 (B = 200 epochs, M = 5 cycles, initial learning rate alpha = 0.2) on CIFAR-100:

```
th main.lua -netType resnet -depth 110 -dataset cifar100 -batchSize 64 -nEpochs 200 -lrShape cosine -nCycles 5 -LR 0.2 -save checkpoints/
```
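At test time, the snapshot taken at the end of each cycle contributes one member to the ensemble, and the ensemble prediction is the average of the members' softmax outputs. A minimal, hypothetical Torch sketch of that step (the helper name, checkpoint paths, and a single-tensor model output are assumptions, not this repository's evaluation code):

```lua
require 'torch'
require 'nn'

-- Average the softmax outputs of M saved snapshots (hypothetical helper).
local function ensemblePredict(snapshotPaths, input)
   local avgProbs
   for _, path in ipairs(snapshotPaths) do
      local model = torch.load(path)  -- load one snapshot checkpoint
      model:evaluate()                -- switch to evaluation mode
      local probs = nn.SoftMax():forward(model:forward(input))
      avgProbs = avgProbs and avgProbs:add(probs) or probs:clone()
   end
   return avgProbs:div(#snapshotPaths)  -- mean over the M snapshots
end
```

With -nCycles 5 and -save checkpoints/, the five models saved at the end of each cycle would be the snapshot checkpoints passed to such a helper.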

Contact

[gh349, yl2363] at cornell.edu. Any discussions, suggestions and questions are welcome!
