
Code for our paper "Attention Distillation: self-supervised vision transformer students need more guidance", published at BMVC 2022.


wangkai930418/attndistill


Kai Wang, Fei Yang and Joost van de Weijer

Requirements

Please check the packages in your environment against "requirements.txt"; these scripts generally do not depend heavily on specific package versions.
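Since exact versions are not critical, one way to compare your environment against the pinned list is a small stdlib check rather than a full reinstall. This is a minimal sketch (not part of the repository); it only understands bare names and simple "name==version" pins:

```python
from importlib.metadata import version, PackageNotFoundError


def check_requirements(path="requirements.txt"):
    """Report packages that are missing or installed at a different version.

    Handles only bare "name" and exact "name==version" lines; other
    specifiers (>=, ~=, extras) are not parsed by this sketch.
    """
    issues = []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            name, _, pinned = line.partition("==")
            try:
                installed = version(name)
            except PackageNotFoundError:
                issues.append(f"{name}: not installed")
                continue
            if pinned and installed != pinned:
                issues.append(f"{name}: have {installed}, pinned {pinned}")
    return issues
```

An empty return value means every listed package is present at a compatible version.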

Reproducing

Please modify the data paths in "dino_teacher.sh" and "mugs_teacher.sh", then run the scripts with "bash xxx.sh".
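The path edit can also be scripted. The snippet below is a sketch against a toy stand-in file, since the exact flag name inside the launch scripts (here assumed to be `--data_path`) may differ; check the script itself before running it:

```shell
# Toy stand-in for dino_teacher.sh so the edit can be demonstrated;
# in the real repository you would edit the actual launch script.
printf 'python main.py --data_path /old/imagenet\n' > demo_teacher.sh

# Point the script at your local dataset (flag name is an assumption):
sed -i 's|--data_path [^ ]*|--data_path /mnt/datasets/imagenet|' demo_teacher.sh

cat demo_teacher.sh
```

After editing the real script the same way, launch it with "bash dino_teacher.sh" or "bash mugs_teacher.sh".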

Download

The teacher model checkpoints can be downloaded from the GitHub repositories of DINO (https://github.com/facebookresearch/dino) and Mugs (https://github.com/sail-sg/mugs).
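Released checkpoints sometimes wrap the weights under a top-level key (e.g. "teacher" in full DINO training checkpoints) or carry a "module." prefix from DistributedDataParallel. A hedged loading sketch that tolerates both layouts (the key names checked here are assumptions; inspect your downloaded file):

```python
import torch


def load_teacher(ckpt_path, model):
    """Load a downloaded teacher checkpoint into a model.

    The wrapper keys ("teacher", "state_dict", "model") and the
    "module." prefix handling are assumptions about common checkpoint
    layouts, not guarantees about any specific release.
    """
    state = torch.load(ckpt_path, map_location="cpu")
    # Unwrap a nested state dict if the checkpoint uses one.
    for key in ("teacher", "state_dict", "model"):
        if isinstance(state, dict) and key in state:
            state = state[key]
    # Strip a possible DistributedDataParallel "module." prefix.
    state = {k.replace("module.", "", 1): v for k, v in state.items()}
    # strict=False reports mismatches instead of raising on them.
    missing, unexpected = model.load_state_dict(state, strict=False)
    return missing, unexpected
```

Empty `missing` and `unexpected` lists indicate the checkpoint matched the model exactly.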

Others

If you have any questions, do not hesitate to contact us or open an issue.
