This is the code for contrastive deep supervision and distilled contrastive deep supervision.
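Contrastive deep supervision attaches auxiliary projection heads to intermediate layers of the backbone and trains them with a contrastive loss on two augmented views of each image, in addition to the usual cross-entropy on the final output. The snippet below is a minimal sketch of that idea in PyTorch; it assumes a SimCLR-style NT-Xent loss, a backbone that returns its intermediate feature maps, and illustrative head and weight choices, rather than reproducing the exact implementation in train.py.

```python
# Minimal sketch of contrastive deep supervision (illustrative, not the
# repository's exact implementation): auxiliary projection heads on
# intermediate stages are trained with a contrastive loss on two augmented
# views, alongside cross-entropy on the final logits.
import torch
import torch.nn as nn
import torch.nn.functional as F


def nt_xent(z1, z2, temperature=0.5):
    """SimCLR-style NT-Xent loss between two batches of projected features."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)          # (2N, d)
    sim = z @ z.t() / temperature                                # (2N, 2N)
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))                   # drop self-similarity
    # positives: sample i in view 1 pairs with sample i in view 2
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)


class CDSWrapper(nn.Module):
    """Wraps a backbone that returns (logits, [intermediate feature maps])."""

    def __init__(self, backbone, feat_dims, proj_dim=128):
        super().__init__()
        self.backbone = backbone
        # one projection head (global pool + small MLP) per supervised stage
        self.heads = nn.ModuleList(
            nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                          nn.Linear(d, d), nn.ReLU(inplace=True),
                          nn.Linear(d, proj_dim))
            for d in feat_dims)

    def forward(self, x):
        logits, feats = self.backbone(x)
        return logits, [head(f) for head, f in zip(self.heads, feats)]


def cds_loss(model, view1, view2, labels, lam=1.0):
    """Cross-entropy on view1 plus a contrastive loss at every supervised stage."""
    logits, z1 = model(view1)
    _, z2 = model(view2)
    loss = F.cross_entropy(logits, labels)
    for a, b in zip(z1, z2):
        loss = loss + lam * nt_xent(a, b)
    return loss
```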
Install the basic packages required for training.
pip install torch torchvision
Then, train a model with contrastive deep supervision by running
python train.py --model=$model name$
Before applying distilled contrastive deep supervision, you should first train a teacher model with contrastive deep supervision. Taking a ResNet152 teacher as an example, you should run
python train.py --model=resnet152
Then, train the students with the following script.
python distill.py --model=$student name$ --teacher=$teacher name$ --teacher_path=$teacher checkpoint path$
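For example, to distill a ResNet18 student from the ResNet152 teacher trained above (the student name and checkpoint path here are only illustrative), the call would look like
python distill.py --model=resnet18 --teacher=resnet152 --teacher_path=./resnet152_checkpoint.pth

Conceptually, the student is trained against both the ground-truth labels and the teacher that was pre-trained with contrastive deep supervision. The snippet below is a rough sketch of such an objective, assuming temperature-scaled logit distillation plus an L2 match between student and teacher projection-head features; the exact loss used in distill.py may differ.

```python
# Illustrative sketch of a distilled contrastive deep supervision objective:
# standard temperature-scaled logit distillation, plus pulling the student's
# projection-head features at each supervised stage towards the teacher's.
import torch.nn.functional as F


def distill_cds_loss(student_out, teacher_out, labels, T=4.0, alpha=1.0, beta=1.0):
    """student_out / teacher_out: (logits, [projected features per stage])."""
    s_logits, s_feats = student_out
    t_logits, t_feats = teacher_out
    loss = F.cross_entropy(s_logits, labels)
    # temperature-scaled KL divergence between student and teacher logits
    kd = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                  F.softmax(t_logits / T, dim=1),
                  reduction='batchmean') * (T * T)
    loss = loss + alpha * kd
    # match projected intermediate features stage by stage
    for s, t in zip(s_feats, t_feats):
        loss = loss + beta * F.mse_loss(F.normalize(s, dim=1),
                                        F.normalize(t, dim=1).detach())
    return loss
```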
Experiments on ImageNet
Please refer to the run.sh file in the folder to run contrastive deep supervision and distilled contrastive deep supervision on ImageNet. Note that you should train a teacher model before applying knowledge distillation.