This repository contains the code for training and evaluating the models from our CVPR 2023 paper DART: Diversify-Aggregate-Repeat Training Improves Generalization of Neural Networks (main paper and supplementary). The paper is also available on arXiv.
- Python 3.6.9
- PyTorch 1.8
- Torchvision 0.8.0
- Numpy 1.19.2
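Assuming a standard pip-based setup, the dependencies listed above can be installed with something like the following sketch (the exact torch/torchvision pairing may need adjusting for your CUDA version):

```bash
# Pins taken from the requirements list above; adjust for your CUDA build if needed.
pip install torch==1.8.0 torchvision==0.8.0 numpy==1.19.2
```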
To train DART on the Domain Generalization task:
```bash
python train_all.py [name_of_exp] --data_dir ./path/to/data --algorithm ERM --dataset PACS --inter_freq 1000 --steps 10001
```
To enable SWAD, set `swad: True` in the config.yaml file or pass `--swad True` on the command line.
Similarly, to change the model (e.g., ViT) or the SWAD/MIRO hyperparameters, you can either update the config.yaml file or pass them as command-line arguments.
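As a concrete illustration, the config.yaml entries might look like the sketch below. Only the `swad` key is taken from the instructions above; the `model` key name is an assumption and should be checked against the actual config file shipped with the repository.

```yaml
# Hypothetical config.yaml sketch; key names other than `swad` are assumptions.
swad: True            # enable SWAD weight averaging (documented above)
model: clip_vit-b16   # backbone selection, e.g. CLIP ViT-B/16 (key name assumed)
```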
For example, to train DART with MIRO on OfficeHome:

```bash
python train_all.py [name_of_exp] --data_dir ./path/to/data \
    --lr 3e-5 \
    --inter_freq 600 \
    --steps 8001 \
    --dataset OfficeHome \
    --algorithm MIRO \
    --ld 0.1 \
    --weight_decay 1e-6 \
    --swad True \
    --model clip_vit-b16
```
If you find our work useful, please consider citing:

```bibtex
@inproceedings{jain2023dart,
  title={DART: Diversify-Aggregate-Repeat Training Improves Generalization of Neural Networks},
  author={Jain, Samyak and Addepalli, Sravanti and Sahu, Pawan Kumar and Dey, Priyam and Babu, R Venkatesh},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={16048--16059},
  year={2023}
}
```