This repository contains everything necessary to reproduce the experiments in our paper: Compressibility and Generalization in Large-Scale Deep Learning (Wenda Zhou, Victor Veitch, Morgane Austern, Ryan P. Adams and Peter Orbanz).
The directory compression_lenet contains the scripts which implement the training, pruning, quantization, and evaluation of the LeNet-5 network.
The directory compression-mobilenet contains the scripts which implement the pruning, quantization, and evaluation of the noise stability of our network. The typical compression pipeline is to take an already trained network (such as those available here), prune it, and then quantize it (a minimal sketch of this pipeline is given below). Please see the readme in that directory for more details on compressing MobileNet.
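For illustration only, here is a minimal sketch of the prune-then-quantize pipeline described above, written with NumPy. The function names and parameters (`prune_by_magnitude`, `quantize_uniform`, `sparsity`, `num_bits`) are hypothetical and are not the repository's API; see the scripts in compression-mobilenet for the actual implementation.

```python
# Hypothetical sketch of a prune-then-quantize pipeline; not the repository's API.
import numpy as np

def prune_by_magnitude(weights, sparsity=0.9):
    """Zero out the smallest-magnitude entries so that `sparsity` fraction are zero."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

def quantize_uniform(weights, num_bits=4):
    """Quantize the nonzero entries to a uniform grid with 2**num_bits levels,
    keeping the pruned (zero) entries at exactly zero."""
    mask = weights != 0
    if not mask.any():
        return weights
    vals = weights[mask]
    w_min, w_max = vals.min(), vals.max()
    scale = (w_max - w_min) / (2 ** num_bits - 1)
    if scale == 0:
        return weights
    out = np.zeros_like(weights)
    out[mask] = np.round((vals - w_min) / scale) * scale + w_min
    return out

# Example: prune then quantize a random weight matrix.
w = np.random.randn(256, 256).astype(np.float32)
w_compressed = quantize_uniform(prune_by_magnitude(w, sparsity=0.9), num_bits=4)
```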
The directory randomization-cifar contains the scripts which implement the training and pruning of a ResNet-56 on the CIFAR-10 dataset with a portion of the labels randomized.
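As a rough illustration of the label-randomization step, the sketch below corrupts a chosen fraction of CIFAR-10 labels with uniformly random classes. The function `randomize_labels` and its parameters are hypothetical and do not reflect the repository's code; refer to the scripts in randomization-cifar for the actual procedure.

```python
# Hypothetical sketch of label randomization; not the repository's code.
import numpy as np

def randomize_labels(labels, fraction=0.5, num_classes=10, seed=0):
    """Replace `fraction` of the labels with uniformly random class indices."""
    rng = np.random.RandomState(seed)
    labels = labels.copy()
    idx = rng.choice(len(labels), size=int(fraction * len(labels)), replace=False)
    labels[idx] = rng.randint(0, num_classes, size=len(idx))
    return labels

# Example: corrupt half of a dummy CIFAR-10-sized label vector.
y = np.random.randint(0, 10, size=50000)
y_corrupt = randomize_labels(y, fraction=0.5)
```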