SRGAN (Super-Resolution Generative Adversarial Network)

A TensorFlow implementation of Ledig et al.'s paper "Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network" (see: https://arxiv.org/abs/1609.04802). This implementation differs considerably from the original paper. The differences are as follows:

  1. The MNIST dataset is used for convenience. (It should be straightforward to apply this scheme to a large image dataset such as Urban 100.)
  2. I've completely replaced the MSE loss with a GAN loss, using a tuple input for the discriminator. (See the training source code.)
  3. I've used ESPCN (a sub-pixel CNN) instead of deconvolution. (See: http://www.cv-foundation.org/openaccess/content_cvpr_2016/papers/Shi_Real-Time_Single_Image_CVPR_2016_paper.pdf) A minimal sketch follows this list.
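For reference, the sub-pixel idea in item 3 can be sketched in a few lines. This is a minimal TensorFlow 2 illustration, not the repository's sugartensor code; the function name and the shapes in the comments are assumptions:

```python
import tensorflow as tf

def subpixel_upscale(x, scale=2, out_channels=1):
    """ESPCN-style upscaling (illustrative sketch): an ordinary convolution
    emits out_channels * scale**2 feature maps, and depth_to_space then
    rearranges those channels into a scale-times-larger spatial grid,
    so no deconvolution layer is needed."""
    x = tf.keras.layers.Conv2D(out_channels * scale ** 2, 3, padding='same')(x)
    # (N, H, W, out_channels * scale^2) -> (N, H*scale, W*scale, out_channels)
    return tf.nn.depth_to_space(x, scale)
```

Here `out_channels=1` matches single-channel MNIST digits.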

Existing CNN-based super-resolution methods mainly use an MSE loss, which makes the super-resolved images look blurry. If we replace the MSE loss with gradients from a GAN, we may prevent these blurry artifacts; this is the key idea of the paper. I think the idea looks promising, and my experimental results on the MNIST dataset look good.
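As a rough illustration of that replacement, here is a minimal TensorFlow 2 sketch of purely adversarial losses. The repository's actual training code is written with sugartensor, and the function names here are hypothetical:

```python
import tensorflow as tf

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def discriminator_loss(real_logits, fake_logits):
    # Real high-resolution images should be scored as 1,
    # super-resolved (generated) images as 0.
    return (bce(tf.ones_like(real_logits), real_logits)
            + bce(tf.zeros_like(fake_logits), fake_logits))

def generator_loss(fake_logits):
    # The generator is trained only to fool the discriminator;
    # no pixel-wise MSE term is mixed in.
    return bce(tf.ones_like(fake_logits), fake_logits)
```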

Dependencies

  1. tensorflow >= rc0.11
  2. sugartensor >= 0.0.1.7
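Both are available on PyPI, so something like `pip install sugartensor` should pull in the needed packages, assuming a Python environment contemporary with these versions; newer TensorFlow releases are not API-compatible with this code.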

Training the network

Execute

```
python train.py
```

to train the network. The resulting checkpoint files and log files will be written to the `asset/train` directory. Launch `tensorboard --logdir asset/train/log` to monitor the training process.

Generating a sample image

Execute

```
python generate.py
```

to generate a sample image. A `sample.png` file will be written to the `asset/train` directory.

Super-resolution image sample

(Image: a sample super-resolution result generated by SRGAN.)

Other resources

  1. Original GAN tensorflow implementation
  2. InfoGAN tensorflow implementation
  3. Supervised InfoGAN tensorflow implementation
  4. EBGAN tensorflow implementation
  5. Time-series InfoGAN tensorflow implementation

Authors

Namju Kim (buriburisuri@gmail.com) at Jamonglabs Co., Ltd.