- Ubuntu 18.04, Python 3.8, NVIDIA A100
- PyTorch 1.8.1 + CUDA 11.1
bash ./download.sh
The files will be downloaded and saved in the following folders (see the sanity check sketched below the tree):
pittsburgh
├── database
├── query
└── structure
logs
├── student_contrast
├── student_quadruplet
├── student_triplet
└── teacher_triplet
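
As a quick sanity check after the download, you can verify that the expected folders exist. This is a minimal sketch; the folder names are taken from the tree above, nothing else is assumed:

```python
# Minimal sketch: verify the downloaded layout matches the tree above.
from pathlib import Path

expected = [
    "pittsburgh/database", "pittsburgh/query", "pittsburgh/structure",
    "logs/student_contrast", "logs/student_quadruplet",
    "logs/student_triplet", "logs/teacher_triplet",
]
missing = [p for p in expected if not Path(p).is_dir()]
print("All folders present." if not missing else f"Missing: {missing}")
```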
# STUN
python main.py --resume=logs/student_triplet/ckpt.pth.tar
# STUN (Contrast)
python main.py --resume=logs/student_contrast/ckpt.pth.tar
# STUN (Quadruplet)
python main.py --resume=logs/student_quadruplet/ckpt.pth.tar
# Standard Triplet
python main.py --phase=test_tea --resume=logs/teacher_triplet/ckpt.pth.tar
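
Each `--resume` argument points at a `ckpt.pth.tar` file. How `main.py` restores it is defined in the repo; the sketch below only illustrates the typical PyTorch pattern, and the checkpoint keys `state_dict`/`epoch` are assumptions rather than confirmed names:

```python
# Sketch only: typical way a PyTorch ckpt.pth.tar is restored.
# The key names 'state_dict' and 'epoch' are assumptions, not taken from main.py.
import torch
from torch import nn

def resume_from_checkpoint(model: nn.Module, ckpt_path: str) -> int:
    """Load weights into `model` and return the stored epoch (0 if absent)."""
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    model.load_state_dict(checkpoint["state_dict"])
    return int(checkpoint.get("epoch", 0))
```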
python vis_results.py
# You can plot the results of different models by populating the NETWORK variable.
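
How `NETWORK` is consumed is defined in `vis_results.py` itself; the snippet below is only a hypothetical illustration of populating it with the log-folder names listed above:

```python
# Hypothetical illustration only: the real variable lives in vis_results.py.
NETWORK = [
    "student_triplet",     # STUN
    "student_contrast",    # STUN (Contrast)
    "student_quadruplet",  # STUN (Quadruplet)
    "teacher_triplet",     # Standard Triplet
]
```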
# train the teacher net
python main.py --phase=train_tea --loss=tri
# train the student net supervised by the pretrained teacher net
python main.py --phase=train_stu --resume=[teacher_net_xxx/ckpt_best.pth.tar]
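
For intuition, the self-teaching step distills the frozen teacher's embeddings into a student that additionally predicts a per-dimension variance. The sketch below shows an uncertainty-weighted distillation loss in that spirit; the exact formulation and variable names used in `main.py` may differ:

```python
# Minimal sketch of an uncertainty-weighted distillation loss in the spirit of the
# self-teaching step; the exact formulation used in main.py may differ.
import torch

def self_teaching_loss(student_mu: torch.Tensor,
                       student_log_var: torch.Tensor,
                       teacher_feat: torch.Tensor) -> torch.Tensor:
    """All inputs are (B, D). Residuals to the frozen teacher embedding are
    down-weighted where the student predicts high variance, and a log-variance
    term stops the student from inflating its variance for free."""
    residual = (student_mu - teacher_feat.detach()) ** 2
    return 0.5 * (torch.exp(-student_log_var) * residual + student_log_var).mean()

# Usage with random tensors, just to show the expected shapes:
mu, log_var, teacher = torch.randn(8, 256), torch.randn(8, 256), torch.randn(8, 256)
loss = self_teaching_loss(mu, log_var, teacher)
```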
From the empirical results, we observed that the correlation between recall@N and uncertainty level settles into a sensible trend after 30 epochs, whereas the ECE (Expected Calibration Error) diverges if the student network is trained for too long. We therefore restricted our examination to the model's performance from epoch 30 to epoch 35 and chose the checkpoint with the lowest ECE.
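
For reference, ECE is the standard binned gap between confidence and accuracy. The sketch below is a generic implementation; how the repo maps predicted uncertainty to a confidence score and retrieval hits to correctness is not reproduced here:

```python
# Generic binned ECE; mapping uncertainty -> confidence and retrieval hit -> correctness
# is repo-specific and not shown here.
import numpy as np

def expected_calibration_error(confidence: np.ndarray, correct: np.ndarray, n_bins: int = 10) -> float:
    """Weighted average over bins of |mean accuracy - mean confidence|.
    `confidence` lies in [0, 1]; `correct` is 0/1 per query."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    bin_ids = np.digitize(confidence, edges[1:-1])  # indices 0 .. n_bins-1
    ece = 0.0
    for b in range(n_bins):
        mask = bin_ids == b
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - confidence[mask].mean())
    return float(ece)
```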
# evaluate
./eval_batch.sh
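
The selection rule described above (lowest ECE among epochs 30-35) amounts to something like the following; the per-epoch ECE values are placeholders, not measured results:

```python
# Sketch of the selection rule above: among epochs 30-35, keep the lowest-ECE checkpoint.
# The per-epoch ECE values are placeholders, not measured results.
ece_per_epoch = {30: 0.051, 31: 0.047, 32: 0.049, 33: 0.044, 34: 0.046, 35: 0.050}
best_epoch = min(ece_per_epoch, key=ece_per_epoch.get)
print(f"Selected checkpoint: epoch {best_epoch}")
```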
If you find our work useful, please consider citing:
@INPROCEEDINGS{stun_cai,
  author={Cai, Kaiwen and Lu, Chris Xiaoxuan and Huang, Xiaowei},
  booktitle={2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  title={STUN: Self-Teaching Uncertainty Estimation for Place Recognition},
  year={2022},
  pages={6614-6621},
  doi={10.1109/IROS47612.2022.9981546}}