This is the PyTorch implementation of the paper "Invisible Backdoor Attack With Dynamic Triggers Against Person Re-Identification" (accepted by IEEE T-IFS 2023).
Installation:
- First, install the required packages listed in `requirements.txt` by running `pip install -r requirements.txt`.
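A minimal sketch of the install step, assuming a standard Python toolchain: creating a virtual environment first is optional and not required by this repo, but it keeps the pinned packages in `requirements.txt` from conflicting with system-wide ones.

```shell
# Optional but recommended: isolate the dependencies in a virtual environment.
python3 -m venv .venv
. .venv/bin/activate

# Install the packages pinned by this repository.
pip install -r requirements.txt
```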
Prepare Pretrained Model:
- The feature-extraction stage of our identity hashing network uses a pre-trained SBS-R101 model, which can be obtained from the FastReID GitHub repository.
- The steganography network is adapted from StegaStamp and must be retrained to match the image size of your dataset; see the StegaStamp GitHub repository.
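Before wiring the downloaded weights into the pipeline, it can help to sanity-check the checkpoint file. The sketch below is an assumption about the file layout, not part of this repo: FastReID checkpoints typically nest the state dict under a `"model"` key, while a plain `torch.save`'d state dict has the tensors at the top level.

```python
import torch

def checkpoint_summary(path):
    """Load a checkpoint on CPU and report (number of weight tensors, total parameters).

    Handles both FastReID-style checkpoints (state dict under a "model" key)
    and plain state dicts. The exact layout of the SBS-R101 file you download
    is an assumption -- adjust the unwrapping if yours differs.
    """
    ckpt = torch.load(path, map_location="cpu")
    state = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt
    n_params = sum(v.numel() for v in state.values() if torch.is_tensor(v))
    return len(state), n_params
```

Loading on CPU via `map_location="cpu"` avoids CUDA errors when inspecting weights on a machine without a GPU.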
References:
- FastReID: A Pytorch Toolbox for General Instance Re-identification
```
@article{he2020fastreid,
  title   = {FastReID: A Pytorch Toolbox for General Instance Re-identification},
  author  = {He, Lingxiao and Liao, Xingyu and Liu, Wu and Liu, Xinchen and Cheng, Peng and Mei, Tao},
  journal = {arXiv preprint arXiv:2006.02631},
  year    = {2020}
}
```
- StegaStamp: Invisible Hyperlinks in Physical Photographs
```
@inproceedings{2019stegastamp,
  title     = {StegaStamp: Invisible Hyperlinks in Physical Photographs},
  author    = {Tancik, Matthew and Mildenhall, Ben and Ng, Ren},
  booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2020}
}
```
Dataset Preparation:
- Put the dataset to be poisoned into `./dataset`.
- Set the input and output paths, the parameters of the pre-trained model, and the configuration of the victim model in `backdoor_implantation.py`.
- Run the following command: `python backdoor_implantation.py`
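A quick pre-flight check before running the poisoning script can catch path mistakes early. This helper is not part of the repo; the `./dataset` default and the image extensions are assumptions based on the steps above.

```python
from pathlib import Path

def count_images(root="./dataset", exts=(".jpg", ".jpeg", ".png")):
    """Count image files under the dataset root before poisoning.

    Raises FileNotFoundError if the directory is missing, so a wrong path
    fails loudly instead of silently poisoning zero images.
    """
    root = Path(root)
    if not root.is_dir():
        raise FileNotFoundError(f"Dataset directory not found: {root}")
    return sum(1 for p in root.rglob("*") if p.suffix.lower() in exts)
```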
Citation:
- If you use the DT-IBA method in your work, please cite this paper:
```
@article{10285514,
  author  = {Sun, Wenli and Jiang, Xinyang and Dou, Shuguang and Li, Dongsheng and Miao, Duoqian and Deng, Cheng and Zhao, Cairong},
  journal = {IEEE Transactions on Information Forensics and Security},
  title   = {Invisible Backdoor Attack With Dynamic Triggers Against Person Re-Identification},
  year    = {2024},
  volume  = {19},
  pages   = {307-319},
  doi     = {10.1109/TIFS.2023.3322659}
}
```