This repository contains the code for generating Universal-Noise Annotation (UNA), which is a more practical setting that encompasses all types of noise that can occur in object detection. Additionally, experiment configuration files, log files, and links to download pre-trained weights are included.
You can use this code to simulate various types of noise and evaluate the performance of object detection models under different noise settings. It provides a comprehensive framework for studying the impact of noise on object detection algorithms.
The paper was presented as an oral presentation at the Medical Imaging meets NeurIPS 2023 workshop.
The UNA dataset can be generated with una_inj.py:
python una_inj.py --ratio 0.1
una_inj.py takes the following arguments:
- --path : The file path to the COCO annotation JSON file (e.g., ./instances_train2017.json).
- --target : The file path where the UNA dataset will be stored.
- --output : Prefix for the output file.
- --ratio : Intensity of the synthesized noise.
- --class_type : Select either 'coco' or 'voc'.
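The command-line interface above can be sketched with argparse. The flag names follow the documented list; the defaults shown here are illustrative assumptions, not necessarily the script's actual defaults.

```python
import argparse

def build_parser():
    # Flag names follow the documented interface of una_inj.py;
    # the defaults are illustrative assumptions.
    parser = argparse.ArgumentParser(
        description="Inject UNA noise into a COCO-format annotation file.")
    parser.add_argument("--path", type=str, default="./instances_train2017.json",
                        help="COCO annotation JSON file to read.")
    parser.add_argument("--target", type=str, default="./",
                        help="File path where the UNA dataset will be stored.")
    parser.add_argument("--output", type=str, default="una",
                        help="Prefix for the output file.")
    parser.add_argument("--ratio", type=float, default=0.1,
                        help="Intensity of the synthesized noise.")
    parser.add_argument("--class_type", type=str, choices=["coco", "voc"],
                        default="coco", help="Dataset class vocabulary.")
    return parser

args = build_parser().parse_args(["--ratio", "0.2", "--class_type", "voc"])
print(args.ratio, args.class_type)  # 0.2 voc
```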
NOTE: Currently, una_inj.py supports only the COCO and VOC formats.
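To illustrate what "universal" noise means, the sketch below perturbs a COCO-style annotation list with the standard annotation-noise types (inaccurate boxes, wrong labels, missing boxes, spurious boxes) at a given ratio. This is a simplified illustration of the idea, not the actual logic of una_inj.py; the jitter ranges and box sizes are arbitrary assumptions.

```python
import random

def inject_una_noise(annotations, num_classes, ratio, img_w=640, img_h=480, seed=0):
    """Apply mixed annotation noise to COCO-style annotations: a list of dicts
    with 'bbox' = [x, y, w, h] and 'category_id'. Simplified sketch."""
    rng = random.Random(seed)
    noisy = []
    for ann in annotations:
        if rng.random() >= ratio:          # keep this annotation clean
            noisy.append(dict(ann))
            continue
        kind = rng.choice(["jitter", "flip", "drop", "spurious"])
        if kind == "drop":                 # missing annotation
            continue
        ann = dict(ann)
        if kind == "jitter":               # inaccurate bounding box
            x, y, w, h = ann["bbox"]
            ann["bbox"] = [x + rng.uniform(-0.2, 0.2) * w,
                           y + rng.uniform(-0.2, 0.2) * h,
                           max(1.0, w * rng.uniform(0.8, 1.2)),
                           max(1.0, h * rng.uniform(0.8, 1.2))]
        elif kind == "flip":               # wrong class label
            ann["category_id"] = rng.randrange(num_classes)
        noisy.append(ann)
        if kind == "spurious":             # extra background box
            w, h = rng.uniform(10, 100), rng.uniform(10, 100)
            noisy.append({"bbox": [rng.uniform(0, img_w - w),
                                   rng.uniform(0, img_h - h), w, h],
                          "category_id": rng.randrange(num_classes)})
    return noisy
```

With ratio=0.0 the annotations pass through unchanged; larger ratios corrupt a larger fraction of them with a random mix of the four noise types.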
- Download COCO dataset
First, download the COCO dataset from the official COCO website, or use the linked script for convenience.
- Generate UNA dataset
You can use una_inj.sh to generate the UNA dataset. Please refer to the argument explanations above and make the necessary modifications.
git clone https://github.com/Ryoo72/UNA.git
cd UNA
bash una_inj.sh
- Experiment
Enjoy your experiments using MMDetection, Detectron, or your own framework. You need to modify the config file according to your situation.
- Download PASCAL VOC dataset
The PASCAL VOC dataset can be downloaded from the official website or mirror websites. Alternatively, you can use the 'voc_download.sh' script in the 'tools' directory.
cd tools
bash voc_download.sh
- Format converting
To use una_inj.py, the dataset needs to be converted to COCO format. You can use the script provided by mmdetection for the conversion.
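At its core, this conversion maps each VOC XML annotation into COCO-style image and annotation records. The minimal sketch below is my own simplification (not mmdetection's script) and assumes the standard 20-class VOC vocabulary and a hypothetical helper name.

```python
import xml.etree.ElementTree as ET

# The 20 PASCAL VOC classes, mapped to 1-based COCO-style category ids.
VOC_CLASSES = ["aeroplane", "bicycle", "bird", "boat", "bottle", "bus", "car",
               "cat", "chair", "cow", "diningtable", "dog", "horse", "motorbike",
               "person", "pottedplant", "sheep", "sofa", "train", "tvmonitor"]

def voc_xml_to_coco(xml_str, image_id, ann_start_id=1):
    """Convert one VOC annotation XML into COCO-style image/annotation dicts."""
    root = ET.fromstring(xml_str)
    size = root.find("size")
    image = {"id": image_id,
             "file_name": root.findtext("filename"),
             "width": int(size.findtext("width")),
             "height": int(size.findtext("height"))}
    anns = []
    for obj in root.iter("object"):
        box = obj.find("bndbox")
        xmin, ymin = float(box.findtext("xmin")), float(box.findtext("ymin"))
        xmax, ymax = float(box.findtext("xmax")), float(box.findtext("ymax"))
        w, h = xmax - xmin, ymax - ymin   # COCO boxes are [x, y, width, height]
        anns.append({"id": ann_start_id + len(anns),
                     "image_id": image_id,
                     "category_id": VOC_CLASSES.index(obj.findtext("name")) + 1,
                     "bbox": [xmin, ymin, w, h],
                     "area": w * h,
                     "iscrowd": 0})
    return image, anns
```

The key difference to keep in mind is the box convention: VOC stores corner coordinates (xmin, ymin, xmax, ymax), while COCO stores (x, y, width, height).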
- Generate UNA dataset
You can use una_inj.sh to generate the UNA dataset. Please refer to the argument explanations above and make the necessary modifications.
git clone https://github.com/Ryoo72/UNA.git
cd UNA
bash una_inj.sh voc
- Experiment
Enjoy your experiments using MMDetection, Detectron, or your own framework. If you want to convert the annotation file back to PASCAL VOC format, please refer to the following commands.
$ git clone https://github.com/KapilM26/coco2VOC.git
$ conda create --name voccoco python=3.8 -y
$ conda activate voccoco
$ cd coco2VOC
$ pip install -r requirements.txt
$ pip install --upgrade numpy
Note
Check out this link to use DeepLesion, a large-scale dataset of CT images.
We provide the download links for the UNA datasets that we used for the experiments. The copyright of the annotations follows the Creative Commons Attribution 4.0 License of the MS COCO dataset.
| 5% | 10% | 15% | 20% | 25% | 30% | 35% | 40% |
|---|---|---|---|---|---|---|---|
| link | link | link | link | link | link | link | link |
Experimental results for various types of detectors under the UNA setting (columns indicate the noise ratio).
| Detector | Reference | Backbone | 0% | 10% | 20% | 30% | 40% |
|---|---|---|---|---|---|---|---|
| FasterRCNN | Ren et al. | ResNet-50 | 37.4 | 33.1 | 26.6 | 16.2 | 1.0 |
| OHEM | Shrivastava et al. | ResNet-50 | 37.7 | 32.9 | 25.3 | 13.1 | 0.3 |
| RetinaNet | Lin et al. | ResNet-50 | 36.5 | 32.4 | 27.6 | 17.1 | 1.8 |
| FCOS | Tian et al. | ResNet-50 | 36.5 | 33.2 | 29.7 | 23.4 | 9.0 |
| ATSS | Zhang et al. | ResNet-50 | 39.4 | 36.1 | 32.4 | 26.2 | 11.8 |
| DINO | Zhang et al. | ResNet-50 | 49.0 | 43.6 | 36.5 | 29.7 | 15.5 |
- For more detailed setup, please refer to the configuration file.
- For more benchmarks, including the results on PASCAL VOC, please refer to our paper.
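One quick way to read the results table is to compute, per detector, the fraction of its clean (0% noise) AP that survives at each noise level. The snippet below does this with the numbers copied from the table:

```python
# AP values copied from the results table above (ResNet-50 backbone);
# columns correspond to noise ratios 0%, 10%, 20%, 30%, 40%.
results = {
    "FasterRCNN": [37.4, 33.1, 26.6, 16.2, 1.0],
    "OHEM":       [37.7, 32.9, 25.3, 13.1, 0.3],
    "RetinaNet":  [36.5, 32.4, 27.6, 17.1, 1.8],
    "FCOS":       [36.5, 33.2, 29.7, 23.4, 9.0],
    "ATSS":       [39.4, 36.1, 32.4, 26.2, 11.8],
    "DINO":       [49.0, 43.6, 36.5, 29.7, 15.5],
}

def retained_ap(aps):
    """Fraction of the clean (0% noise) AP retained at each noise level."""
    clean = aps[0]
    return [round(ap / clean, 3) for ap in aps]

for name, aps in results.items():
    print(f"{name:>10}: {retained_ap(aps)}")
```

For example, at 40% noise FasterRCNN retains only about 2.7% of its clean AP, while FCOS and DINO retain roughly 25% and 32%, respectively.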
.
├── LICENSE
├── README.md
├── una_inj.sh
├── una_inj.py
├── figures
│ └── figure1,2
├── tools
│ ├── VOC_Gen.sh
│ ├── coco_download.sh
│ └── voc_download.sh
└── experiments
├── faster_rcnn_resnet50
│ ├── configs
│ │ └── {conf1.py},{conf2.py}...
│ └── logs
│ └── {log1.log},{log2.log}...
├── table4
│ ├── configs
│ │ └── {conf1.py},{conf2.py}...
│ └── logs
│ └── {log1.log},{log2.log}...
└── table6
├── configs
│ └── {conf1.py},{conf2.py}...
└── logs
└── {log1.log},{log2.log}...
Distributed under the MIT License. See LICENSE for more information.
If you have any questions about using this repository, please feel free to reach out at kwangrok.ryoo@lgresearch.ai. Your feedback and engagement are highly valued.