IMTA

This repository contains the PyTorch code for the paper "Improved Techniques for Training Adaptive Deep Networks" by Hao Li*, Hong Zhang*, Xiaojuan Qi, Ruigang Yang, and Gao Huang (* authors contributed equally).

Introduction

This paper presents three techniques to improve the training efficacy of adaptive deep networks from two aspects: (1) a Gradient Equilibrium algorithm to resolve the learning conflict among the different classifiers; (2) an Inline Subnetwork Collaboration approach and a One-for-all Knowledge Distillation algorithm to enhance the collaboration among classifiers.
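To make the setting concrete, the sketch below shows a toy multi-exit network trained with a plain sum of per-classifier cross-entropy losses. It is a minimal illustration, not the paper's implementation; the model, sizes, and training loop are all hypothetical, and GE, ISC and OFA are precisely the refinements that improve on this naive joint objective.

    # Toy multi-exit network trained with a naive sum of per-exit losses.
    # NOT the paper's implementation; GE / ISC / OFA refine this objective.
    import torch
    import torch.nn as nn

    class TinyMultiExitNet(nn.Module):
        """Toy backbone with an intermediate classifier after every block."""
        def __init__(self, num_classes=10, num_blocks=3, width=32):
            super().__init__()
            self.blocks = nn.ModuleList(
                nn.Sequential(nn.Linear(width, width), nn.ReLU())
                for _ in range(num_blocks)
            )
            self.classifiers = nn.ModuleList(
                nn.Linear(width, num_classes) for _ in range(num_blocks)
            )

        def forward(self, x):
            logits = []
            for block, clf in zip(self.blocks, self.classifiers):
                x = block(x)
                logits.append(clf(x))  # one prediction per exit
            return logits

    model = TinyMultiExitNet()
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    x = torch.randn(8, 32)              # dummy batch
    y = torch.randint(0, 10, (8,))      # dummy labels

    logits_per_exit = model(x)
    loss = sum(criterion(logits, y) for logits in logits_per_exit)
    optimizer.zero_grad()
    loss.backward()                     # all exits push gradients into the shared backbone
    optimizer.step()

Because every classifier back-propagates through the shared backbone, early and late exits compete for the same parameters; resolving that competition is what the Gradient Equilibrium algorithm targets.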

Figure: Method overview.

Results

Figure: Budgeted prediction results on (a) ImageNet and (b) CIFAR-100.
Dependencies

  • Python3
  • PyTorch >= 1.0
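As a quick sanity check (not part of the repository), the following snippet verifies that the interpreter and PyTorch build meet the dependency list above:

    # Optional environment check for the dependency list above (not part of the repo).
    import sys
    import torch

    assert sys.version_info.major == 3, "Python 3 is required"
    torch_major = int(torch.__version__.split(".")[0])
    assert torch_major >= 1, f"PyTorch >= 1.0 is required, found {torch.__version__}"
    print(f"Python {sys.version.split()[0]}, PyTorch {torch.__version__} - OK")
    print("CUDA available:", torch.cuda.is_available())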

Usage

We provide shell scripts for training an MSDNet on ImageNet with GE, ISC, and OFA.

Training an IMTA_MSDNet (block=5, step=4) on ImageNet.

  • Step 1: Train an MSDNet with GE from scratch
    Modify run_GE.sh to set your ImageNet path, your GPU devices, and your saving directory, then run

    bash run_GE.sh
    
  • Step 2: Train the classifiers with ISC and OFA
    Modify run_IMTA.sh to set your ImageNet path, your GPU devices, and your saving directory (which must differ from the saving directory of your GE_MSDNet). Note that the MSDNet settings must match your trained GE_MSDNet exactly, and the pretrained directory of the IMTA_MSDNet should point to the saving directory of your trained GE_MSDNet. Then run

    bash run_IMTA.sh
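Once both training stages are done, the trained network is meant to be used for budgeted (early-exit) prediction, as in the Results figure. The repository ships its own evaluation code; the snippet below is only a hypothetical sketch of confidence-thresholded early exiting to illustrate the idea.

    # Hypothetical sketch of confidence-thresholded early exiting ("budgeted"
    # inference): stop at the first classifier whose max softmax probability
    # clears a threshold. Use the repository's own evaluation scripts for the
    # reported results; this only illustrates the idea.
    import torch
    import torch.nn.functional as F

    def early_exit_predict(logits_per_exit, threshold=0.9):
        """logits_per_exit: list of [num_classes] tensors, ordered from the
        earliest (cheapest) classifier to the final one."""
        last = len(logits_per_exit) - 1
        for exit_idx, logits in enumerate(logits_per_exit):
            probs = F.softmax(logits, dim=-1)
            conf, pred = probs.max(dim=-1)
            if conf.item() >= threshold or exit_idx == last:
                return pred.item(), exit_idx  # prediction and the exit that fired

    # Example with dummy logits from three exits.
    dummy_logits = [torch.randn(10) for _ in range(3)]
    label, used_exit = early_exit_predict(dummy_logits)
    print(f"predicted class {label} using exit {used_exit}")

Sweeping the threshold trades accuracy against average computation per image, which is what the budgeted prediction curves report.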
    
