Parallel-Independent-Blockwise-Distillation

This repository accompanies our paper "Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression" (link).

It implements a method for distributing the blockwise compression of a model across many workers using TensorFlow and MPI.

[Overview image]
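The sketch below illustrates the general scheme only, not the repository's actual code: each MPI rank independently trains a cheap replacement for one block of the teacher network by regressing that block's input activations onto its output activations. The block boundaries, the student layer, and the random stand-in training batch are all illustrative assumptions.

```python
# Minimal sketch of parallel blockwise distillation (illustrative only).
# Assumes mpi4py and TensorFlow 2.x; the block boundaries and student
# architecture are hypothetical, not the repo's actual choices.
from mpi4py import MPI
import numpy as np
import tensorflow as tf

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

teacher = tf.keras.applications.VGG16(weights=None)  # random weights for demo

# Hypothetical (block input tap, block output tap) layer pairs for three
# VGG16 blocks: each pair brackets one block of the teacher.
blocks = [("block1_pool", "block2_pool"),
          ("block2_pool", "block3_pool"),
          ("block3_pool", "block4_pool")]
in_name, out_name = blocks[rank % len(blocks)]  # one block per MPI rank

# Tap the teacher so a single forward pass yields the block's input and output.
tap = tf.keras.Model(
    inputs=teacher.input,
    outputs=[teacher.get_layer(in_name).output,
             teacher.get_layer(out_name).output])

# Stand-in for real data (the paper fine-tunes on upscaled CIFAR-10).
images = np.random.rand(32, 224, 224, 3).astype("float32")
x_in, x_out = tap.predict(images, verbose=0)

# A cheaper student block: one separable conv plus the block's downsampling.
student = tf.keras.Sequential([
    tf.keras.Input(shape=x_in.shape[1:]),
    tf.keras.layers.SeparableConv2D(x_out.shape[-1], 3, padding="same",
                                    activation="relu"),
    tf.keras.layers.MaxPooling2D()])
student.compile(optimizer="adam", loss="mse")
student.fit(x_in, x_out, epochs=1, verbose=0)

# Workers train with no coordination; rank 0 only aggregates final losses.
results = comm.gather((rank, in_name,
                       student.evaluate(x_in, x_out, verbose=0)),
                      root=0)
if rank == 0:
    print(results)
```

Launched with something like `mpirun -np 3 python distill_blocks.py` (a hypothetical filename), each rank trains its block with no communication until the final gather, which is what makes the blockwise approach embarrassingly parallel.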

Many of the notebooks depend on pretrained VGG16 and ResNet50 models fine-tuned on upscaled CIFAR-10. For size reasons, the .h5 files are not tracked in this repo; after cloning, download them from Google Drive at the following links.
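Once downloaded, the weights can be loaded with tf.keras; the filenames below are placeholders, so substitute whatever names the Drive links provide.

```python
# Filenames are hypothetical -- use the names of the downloaded files.
import tensorflow as tf

vgg16 = tf.keras.models.load_model("vgg16_cifar10.h5")
resnet50 = tf.keras.models.load_model("resnet50_cifar10.h5")
vgg16.summary()
```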

Bash scripts are included in this repo to build the Docker image, start the Docker container, and launch the JupyterLab instance needed for this project.

The bash scripts must be run with sudo permissions.
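A typical session might look like the following; the script names here are placeholders, since the exact filenames live in the repo.

```bash
# Hypothetical script names -- check the repo for the actual filenames.
sudo ./build_docker_image.sh      # build the Docker image
sudo ./start_docker_container.sh  # start the container
sudo ./start_jupyterlab.sh        # launch JupyterLab inside it
```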
