This repository contains code demonstrating the method in our IJCAI 2022 paper *Few-Shot Adaptation of Pre-Trained Networks for Domain Shift*. An arXiv version containing both the main manuscript and the appendix is also available.
This code makes use of Dassl.pytorch. Please follow the instructions at https://github.com/KaiyangZhou/Dassl.pytorch#installation to install Dassl.
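For reference, the Dassl installation roughly follows the steps below (a sketch based on the Dassl.pytorch README at the time of writing; the linked instructions are authoritative and may have changed):

```shell
# Clone Dassl.pytorch and install it in development mode
git clone https://github.com/KaiyangZhou/Dassl.pytorch.git
cd Dassl.pytorch
pip install -r requirements.txt
python setup.py develop
```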
We used the NVIDIA container image for PyTorch, release 20.12, to run experiments.
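One way to enter that environment is via the NGC registry (the `20.12-py3` tag and mount path here are assumptions; check the NGC catalog for the exact tag and your own GPU/driver requirements):

```shell
# Launch the PyTorch 20.12 NGC container with GPU access,
# mounting the current directory into the container's workspace
docker run --gpus all -it --rm \
    -v "$(pwd)":/workspace \
    nvcr.io/nvidia/pytorch:20.12-py3
```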
This demonstration runs on PACS. Please download the dataset and save it in `lccs/imcls/data/`.
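The expected layout can be prepared as below (the `pacs/` folder name is an assumption for illustration; the dataset loader in the repository defines the exact path it expects):

```shell
# Create the data directory the scripts read from
mkdir -p lccs/imcls/data
# Place the downloaded PACS dataset inside it, e.g.
# lccs/imcls/data/pacs/
```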
From `lccs/imcls/scripts/`, run `./run_source.sh`.
The source models will be saved in `lccs/imcls/output_source_models/`.
From `lccs/imcls/scripts/`, run `./run_lccs.sh`.
The outputs after adaptation will be saved in `lccs/imcls/output_results/`.
From `lccs/imcls/results_scripts/`, first run `./collect_results.sh` and then `./consolidate_results.sh`.
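Putting the steps above together, the full pipeline runs in this order (assuming the dataset is already in place):

```shell
cd lccs/imcls/scripts
./run_source.sh            # train source models -> output_source_models/
./run_lccs.sh              # few-shot adaptation -> output_results/
cd ../results_scripts
./collect_results.sh       # gather per-run results
./consolidate_results.sh   # consolidate into final tables
```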
@inproceedings{zhang2022lccs,
title = {Few-Shot Adaptation of Pre-Trained Networks for Domain Shift},
author = {Zhang, Wenyu and Shen, Li and Zhang, Wanyue and Foo, Chuan-Sheng},
booktitle = {Proceedings of the Thirty-First International Joint Conference on
Artificial Intelligence, {IJCAI-22}},
publisher = {International Joint Conferences on Artificial Intelligence Organization},
editor = {Luc De Raedt},
pages = {1665--1671},
year = {2022},
month = {7},
note = {Main Track},
doi = {10.24963/ijcai.2022/232},
url = {https://doi.org/10.24963/ijcai.2022/232},
}
Our implementation is based on the MixStyle repository; thanks to the authors of MixStyle.