Testing code for "Deep Wavelet Prediction for Image Super-Resolution" (CVPRW 2017), the NTIRE 2017 Super-Resolution Challenge entry DWSRx4.
Python package requirements:
- tensorflow w/GPU @ https://github.com/tensorflow/tensorflow
- pywt @ https://github.com/PyWavelets/pywt
- cv2 @ https://github.com/opencv/opencv
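The three dependencies can be sanity-checked before running. A minimal sketch (the names below are the Python module names; the corresponding pip packages are `tensorflow-gpu`, `PyWavelets`, and `opencv-python`):

```python
import importlib.util

# Modules the DWSR scripts require; find_spec only probes the import
# machinery, so this check works even when a package is absent.
required = ("tensorflow", "pywt", "cv2")
missing = [m for m in required if importlib.util.find_spec(m) is None]

if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("All required packages are importable.")
```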
- In terminal, type in
python DWSRx4.py
- A prompt then asks for the testing data set:
Please enter the testing path [hit enter to run default set]:
- Hit enter to run the default testing set from DIV2K (NTIRE), which is stored at:
./Testx4Lum
- The final results will be stored at:
./Resultx4Lum
- Run
FinalColorSRx4.m
to generate the final color SR images, which are stored in ./Resultx4Color
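FinalColorSRx4.m combines the network's SR luminance channel with chrominance taken from the bicubic-enlarged color image. A rough numpy sketch of that merge, for readers without MATLAB (the function names and BT.601 full-range constants here are my assumptions for illustration, not code from this repo; the actual script uses MATLAB's YCbCr conversion):

```python
import numpy as np

def rgb_to_ycbcr(img):
    # BT.601-style full-range conversion; img is float RGB in [0, 1].
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.5 + 0.564 * (b - y)
    cr = 0.5 + 0.713 * (r - y)
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(img):
    # Inverse of the conversion above, clipped back to [0, 1].
    y, cb, cr = img[..., 0], img[..., 1], img[..., 2]
    r = y + 1.403 * (cr - 0.5)
    b = y + 1.773 * (cb - 0.5)
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

def merge_luminance(sr_lum, bicubic_color):
    # Replace the luminance of the bicubic-enlarged color image with
    # the network's SR luminance, keeping the bicubic chrominance.
    ycc = rgb_to_ycbcr(bicubic_color)
    ycc[..., 0] = sr_lum
    return ycbcr_to_rgb(ycc)
```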
- The testing data should be the bicubic-enlarged version of the original down-sampled images. For example, to generate x4 super-resolution results, the original x4 down-sampled low-resolution image should first be enlarged x4 in size, and the enlarged version is then fed to DWSR (as described in the fact sheet). Use generateTestX4.m to generate the enlarged LR luminance images.
- The DWSR weights are stored at:
./Weightx4
- The DWSR model is defined in:
netx4.py
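As described in the DWSR paper, the model works in the 2-D Haar wavelet domain: the network predicts residuals for the four sub-bands (LL, LH, HL, HH), which are then inverse-transformed into the SR luminance image. A self-contained numpy sketch of the single-level Haar analysis/synthesis pair (pywt's `dwt2`/`idwt2` with `'haar'` compute the same transform for even-sized inputs; sub-band sign conventions may differ):

```python
import numpy as np

def haar_dwt2(x):
    # Single-level orthonormal 2-D Haar transform of an array of shape (2H, 2W).
    a = x[0::2, 0::2]  # top-left pixel of each 2x2 block
    b = x[0::2, 1::2]  # top-right
    c = x[1::2, 0::2]  # bottom-left
    d = x[1::2, 1::2]  # bottom-right
    ll = (a + b + c + d) / 2  # approximation
    lh = (a + b - c - d) / 2  # horizontal detail
    hl = (a - b + c - d) / 2  # vertical detail
    hh = (a - b - c + d) / 2  # diagonal detail
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    # Exact inverse of haar_dwt2 (perfect reconstruction).
    a = (ll + lh + hl + hh) / 2
    b = (ll + lh - hl - hh) / 2
    c = (ll - lh + hl - hh) / 2
    d = (ll - lh - hl + hh) / 2
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w), dtype=ll.dtype)
    out[0::2, 0::2] = a
    out[0::2, 1::2] = b
    out[1::2, 0::2] = c
    out[1::2, 1::2] = d
    return out
```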
- The script is NOT for training.
The training code is not fully cleaned up; for academic purposes, please request the training code here by providing basic usage information.
@inproceedings{guo2017deep,
title={Deep wavelet prediction for image super-resolution},
author={Guo, Tiantong and Mousavi, Hojjat Seyed and Vu, Tiep Huu and Monga, Vishal},
booktitle={The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
year={2017}
}
@inproceedings{timofte2017ntire,
title={{NTIRE} 2017 challenge on single image super-resolution: Methods and results},
author={Timofte, Radu and Agustsson, Eirikur and Van Gool, Luc and Yang, Ming-Hsuan and Zhang, Lei and Lim, Bee and Son, Sanghyun and Kim, Heewon and Nah, Seungjun and Lee, Kyoung Mu and others},
booktitle={Computer Vision and Pattern Recognition Workshops (CVPRW), 2017 IEEE Conference on},
pages={1110--1121},
year={2017},
organization={IEEE}
}
Tiantong@iPAL2017, tong.renly@gmail.com