
Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty (AvatarKD)

🔥 Official implementation of the paper "Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty" (AvatarKD), ACM MM 2023.

By Yuan Zhang, Weihua Chen, Yichen Lu, Tao Huang, Xiuyu Sun, and Jian Cao.


Installation

Install MMRazor 0.x

git clone -b 0.x https://github.com/open-mmlab/mmrazor.git
cd mmrazor
pip install -v -e .
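
To sanity-check the setup, a minimal probe can be run like the sketch below (my own snippet, not part of the repo; it assumes mmcv and mmdet are also installed for the detection experiments):

# check_env.py -- sanity-check the distillation environment (illustrative only).
import mmcv
import mmdet
import mmrazor

print("mmcv:   ", mmcv.__version__)
print("mmdet:  ", mmdet.__version__)
print("mmrazor:", mmrazor.__version__)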

Prepare Datasets

Download the required datasets (e.g., COCO 2017 for the detection experiments) from https://opendatalab.com.
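
With mmdetection, COCO data is conventionally arranged as below (a sketch of the standard layout, not something this repo mandates; adjust paths to your setup):

data/coco/
├── annotations/
│   ├── instances_train2017.json
│   └── instances_val2017.json
├── train2017/
└── val2017/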

Note: to distill on detection and segmentation tasks, install mmdetection and mmsegmentation, respectively.

Reproducing our results

Train students with Avatars

This repo uses MMRazor as the knowledge distillation toolkit. For environment setup, please see mmrazor/README.md.

Train a student (the 8 is the number of training GPUs):

cd mmrazor
sh tools/mmdet/dist_train_mmdet.sh ${CONFIG} 8 ${WORK_DIR}

For example, to reproduce our reppoints_x101-reppoints-r50_coco result:

sh tools/mmdet/dist_train_mmdet.sh akd_cwd_reppoints_x101-reppoints-r50_coco.py 8 work_dirs/akd_rep_x101-fpn_x50
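
The cwd in the config name refers to channel-wise knowledge distillation, a feature-level loss that normalizes each channel's spatial activations into a distribution before matching student to teacher; AvatarKD applies its self-ensemble, uncertainty-weighted teacher avatars on top of such losses (see the paper for the avatar mechanism itself). As a rough illustration of the channel-wise loss only, not the repo's exact implementation:

# cwd_sketch.py -- illustrative channel-wise distillation loss (my sketch, not the repo's code).
import torch
import torch.nn.functional as F

def channel_wise_kd(feat_s, feat_t, tau=4.0):
    """KL divergence between per-channel spatial distributions.

    feat_s, feat_t: (N, C, H, W) student / teacher feature maps.
    tau: softmax temperature.
    """
    n, c, _, _ = feat_t.shape
    # Flatten spatial dims; softmax turns each channel into a distribution over locations.
    p_t = F.softmax(feat_t.reshape(n, c, -1) / tau, dim=-1)
    log_p_s = F.log_softmax(feat_s.reshape(n, c, -1) / tau, dim=-1)
    # KL(teacher || student), scaled by tau^2 and averaged over channels.
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * tau ** 2 / c

# Hypothetical usage on one FPN level (teacher features detached):
# loss = channel_wise_kd(student_feats[0], teacher_feats[0].detach())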

Results

All results are box mAP on COCO; numbers in parentheses are each model's standalone mAP.

  • Baseline settings:

    Student                  Teacher                   AvatarKD
    Faster RCNN-R50 (38.4)   Faster RCNN-R101 (39.8)   40.9
    RetinaNet-R50 (37.4)     RetinaNet-R101 (38.9)     40.3
    FCOS-R50 (38.5)          FCOS-R101 (40.8)          42.9

  • Stronger teachers:

    Student                  Teacher                         AvatarKD
    Faster RCNN-R50 (38.4)   Cascade Mask RCNN-X101 (45.6)   42.4
    RetinaNet-R50 (37.4)     RetinaNet-X101 (41.0)           41.5
    RepPoints-R50 (38.6)     RepPoints-R101 (44.2)           42.8

Visualization

[qualitative visualization figure]

License

This project is released under the Apache 2.0 license.

Citation

@article{zhang2023avatar,
  title={Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty},
  author={Zhang, Yuan and Chen, Weihua and Lu, Yichen and Huang, Tao and Sun, Xiuyu and Cao, Jian},
  journal={arXiv preprint arXiv:2305.02722},
  year={2023}
}
