# Revisiting Reverse Distillation for Anomaly Detection (CVPR 2023)

Official code of the CVPR 2023 paper: Revisiting Reverse Distillation for Anomaly Detection.
The paper proposes the RD++ approach for anomaly detection by enriching feature compactness and suppressing anomalous signals through a multi-task learning design. For the feature compactness task, RD++ introduces the self-supervised optimal transport method. For the anomalous signal suppression task, RD++ simulates pseudo-abnormal samples with simplex noise and minimizes the reconstruction loss.
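To make the two auxiliary tasks concrete, here is an illustrative NumPy sketch, not the repository's implementation: a plain Sinkhorn iteration stands in for the self-supervised optimal transport loss (the repo uses geomloss for this), and Gaussian noise smoothed along the feature axis stands in for simplex noise. All function names and the loss weight are hypothetical.

```python
import numpy as np

def sinkhorn_ot(x, y, eps=0.1, n_iters=50):
    """Entropic-regularized OT cost between two feature sets (Sinkhorn iteration)."""
    # pairwise squared-Euclidean cost matrix, normalized for numerical stability
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    C = C / C.max()
    K = np.exp(-C / eps)
    a = np.ones(len(x)) / len(x)          # uniform source weights
    b = np.ones(len(y)) / len(y)          # uniform target weights
    u = np.ones_like(a)
    for _ in range(n_iters):              # alternating Sinkhorn scaling updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]       # transport plan
    return (P * C).sum()                  # OT cost (in normalized cost units)

def pseudo_anomaly(feat, rng, strength=0.5):
    """Stand-in for simplex-noise corruption: add smooth random noise."""
    noise = rng.normal(size=feat.shape)
    # crude smoothing by averaging neighbours along the feature axis
    noise = (noise + np.roll(noise, 1, axis=-1)) / 2.0
    return feat + strength * noise

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 16))                 # a toy batch of feature vectors
corrupted = pseudo_anomaly(feats, rng)           # simulated pseudo-abnormal batch

recon_loss = ((corrupted - feats) ** 2).mean()   # anomalous-signal suppression term
ot_loss = sinkhorn_ot(feats, corrupted)          # feature-compactness term
total = recon_loss + 0.1 * ot_loss               # multi-task combination (weight is illustrative)
```

The multi-task idea is simply that both terms are minimized jointly with the usual distillation loss, so the student stays compact around normal features while learning to reconstruct them from corrupted inputs.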
RD++ achieves new state-of-the-art results on the challenging MVTec benchmark for both anomaly detection and localization. More importantly, compared to recent SOTA methods, RD++ runs 6.x times faster than PatchCore and 2.x times faster than CFA, while introducing negligible latency compared to RD.
## Table of Contents

- Libraries
- Data Preparations
- Train
- Evaluation
- Quick Experiments
- Citation
- Acknowledgement
## Libraries

- geomloss
- numba
or (preferably within a fresh environment, to avoid conflicts):

```shell
pip install -r requirements.txt
```
## Data Preparations

Download the MVTec dataset from [Link].
## Train

To train and test the RD++ method on the 15 MVTec classes, for example on two of them, carpet and leather, run:

```shell
python main.py --save_folder RD++ \
               --classes carpet leather
```
## Evaluation

If you only need to run inference from checkpoints, run:

```shell
python inference.py --checkpoint_folder RD++ \
                    --classes carpet leather
```
## Quick Experiments

The pretrained weights can be found here: [Google Drive].
## Citation

Please cite our paper if you find it helpful in your work.
```bibtex
@InProceedings{Tien_2023_CVPR,
    author    = {Tien, Tran Dinh and Nguyen, Anh Tuan and Tran, Nguyen Hoang and Huy, Ta Duc and Duong, Soan T.M. and Nguyen, Chanh D. Tr. and Truong, Steven Q. H.},
    title     = {Revisiting Reverse Distillation for Anomaly Detection},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {24511-24520}
}
```
## Acknowledgement

We use RD as our baseline and Simplex Noise to simulate pseudo-abnormal samples. We are thankful for these brilliant works!