# Reimplementation of 4x SwinIR

Paper: https://arxiv.org/abs/2108.10257
PSNR (dB) on the 4x SR benchmarks:

Dataset | Bicubic | SwinIR |
---|---|---|
Set5 | 28.648 | 32.666 (32.920) |
Set14 | 26.406 | 29.020 (29.090) |
Urban100 | 23.220 | 27.133 (27.450) |
Values in parentheses are the results reported in the paper.
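The values above are PSNR in dB. A minimal sketch of how such numbers are commonly computed for SR benchmarks, assuming uint8 RGB images, evaluation on the Y (luma) channel via the ITU-R BT.601 conversion, and a border crop equal to the scale factor (4); the exact evaluation protocol of this repo may differ:

```python
import numpy as np

def rgb_to_y(img):
    """uint8 RGB (H, W, 3) -> Y (luma) channel, ITU-R BT.601 convention."""
    img = img.astype(np.float64)
    return 16.0 + (65.481 * img[..., 0] + 128.553 * img[..., 1] + 24.966 * img[..., 2]) / 255.0

def psnr(pred, gt, border=4):
    """PSNR (dB) on the Y channel, cropping `border` pixels from each edge."""
    y_pred = rgb_to_y(pred)[border:-border, border:-border]
    y_gt = rgb_to_y(gt)[border:-border, border:-border]
    mse = np.mean((y_pred - y_gt) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(255.0 ** 2 / mse)
```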
[2023.05.04: Set14 results updated]
The Set14 results now include the grayscale test images. Note, however, that this code itself still does not use grayscale test images.
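One common way to evaluate grayscale benchmark images with an RGB-trained model is to replicate the single channel to three channels before inference. This is a hypothetical sketch (the helper name is illustrative, not from this repo):

```python
import numpy as np

def to_three_channels(img):
    """Replicate a grayscale image of shape (H, W) or (H, W, 1) to (H, W, 3)
    so an RGB-trained SR model can process it. RGB inputs pass through unchanged."""
    if img.ndim == 2:
        img = img[..., None]
    if img.shape[-1] == 1:
        img = np.repeat(img, 3, axis=-1)
    return img
```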
Visual comparison (Bicubic | SwinIR | GT): sample images omitted here.
Item | Setting |
---|---|
Train Data | DIV2K, Flickr2K |
Preprocess | [-1,1] Normalization |
Random Transforms | Crop {DIV2K: 48x48, Flickr2K: 64x64}, Rotation {90°} |
Validation Data | DIV2K |
Test Data | Set5, Set14, Urban100 |
Scale | 4x |
Optimizer | Adam |
Learning Rate | 2e-4 |
Scheduler | Halve the LR at 50%, 80%, 90%, and 95% of 5e5 total iterations |
Actual Trained Iterations | ~4.3e5 |
Loss | L1 |
Batch Size | 8 (2 per GPU × 4 GPUs) |
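The learning-rate schedule above (halve at 50/80/90/95% of 5e5 iterations, mirroring PyTorch's `MultiStepLR` with `gamma=0.5`) can be sketched in plain Python; `lr_at` is an illustrative helper, not part of this repo:

```python
def lr_at(it, base_lr=2e-4, total=500_000, fractions=(0.5, 0.8, 0.9, 0.95)):
    """Learning rate at iteration `it`: the base LR is halved once at each
    milestone (a fixed fraction of the total iteration budget)."""
    milestones = [int(total * f) for f in fractions]
    halvings = sum(it >= m for m in milestones)
    return base_lr * (0.5 ** halvings)
```

For example, the LR stays at 2e-4 until iteration 2.5e5, then drops to 1e-4, 5e-5, 2.5e-5, and finally 1.25e-5.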