CSGAN: Content with Style Generating Using Generative Adversarial Networks

This repository contains the code for my small but interesting project, CSGAN. Please refer to my technical report for more details.

Prerequisites

  • Python 3 (only tested on 3.6.4)
  • PyTorch 0.4.0 (other versions are not supported)
  • NumPy (only tested on 1.14.2)
  • Matplotlib (only tested on 2.2.0)
  • Pillow (only tested on 5.1.0)
  • Jupyter Notebook
  • GPU support (an NVIDIA GTX 1080 Ti is recommended)

File Organization

  • datasets ------ where you should place your datasets
  • dev ------ code still under development
  • images ------ images for this README
  • models ------ code for building our models
  • paper ------ a copy of our paper
  • savefigs ------ HD images generated from the demo notebooks will be saved here
  • savemodels ------ two pretrained generators are provided here
  • styles ------ images for style transfer
  • utils ------ code for data processing, visualization, and small modules
  • Generator_layer_understanding.ipynb ------ notebook for reproducing our layer analysis
  • StyleGAN_training.ipynb ------ notebook for reproducing our fine-tuning process
  • train_fixed_DCGAN.py ------ script for training a 64x64 DCGAN
  • train_fixed_LSGAN.py ------ script for training a 112x112 LSGAN

Getting Started

Installation

  • Clone this repo:
git clone -b master --single-branch https://github.com/dashidhy/CSGAN.git
cd ./CSGAN
  • Make sure you have all prerequisites ready.
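
A quick way to confirm that your environment matches the prerequisites is to query the installed versions from Python. This is just a convenience sketch, not part of the repository:

import torch
import numpy as np

# Report installed versions; compare against the tested versions listed under Prerequisites.
print('PyTorch:', torch.__version__)  # expected 0.4.0
print('NumPy  :', np.__version__)     # tested on 1.14.2

# Training requires a GPU; this reports whether PyTorch can see a CUDA device.
print('CUDA available:', torch.cuda.is_available())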

Style Transfer

Pretrained models and Jupyter notebooks are provided for our fine-tuning process, so you can play with them without downloading any datasets. If you would like to reproduce the whole work, please read the sections below.
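
In outline, using a pretrained generator amounts to loading its weights from ./savemodels and feeding it random latent codes. The sketch below is only illustrative: the class name Generator, the checkpoint file name, and the latent size of 100 are placeholders, and the notebooks show the actual names and usage.

import torch

from models import Generator  # placeholder import; use the actual class defined in ./models/

generator = Generator()
state_dict = torch.load('./savemodels/generator.pth', map_location='cpu')  # placeholder file name
generator.load_state_dict(state_dict)
generator.eval()

# Sample a batch of images from random latent codes (the latent size of 100 is assumed).
z = torch.randn(16, 100, 1, 1)
with torch.no_grad():
    fake_images = generator(z)
print(fake_images.shape)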

Prepare datasets

In our work, we use the church_outdoor class of the LSUN dataset for GAN training. Please refer to their README file for downloading instructions. In the end, you should have a folder named church_outdoor_train_lmdb under ./datasets/LSUN/. You can also plug in your own datasets, but this may require some modifications to the source code.
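
For reference, once the lmdb folder is in place it can be read with torchvision's LSUN dataset class. This is only a sketch of one possible loading path (the lmdb Python package is required); the training scripts ship their own data handling under utils.

import torchvision.datasets as dsets
import torchvision.transforms as transforms
from torch.utils.data import DataLoader

# Assumes ./datasets/LSUN/church_outdoor_train_lmdb exists after following the LSUN README.
transform = transforms.Compose([
    transforms.Resize(64),
    transforms.CenterCrop(64),
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])

dataset = dsets.LSUN('./datasets/LSUN', classes=['church_outdoor_train'], transform=transform)
loader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=4)

images, _ = next(iter(loader))
print(images.shape)  # e.g. torch.Size([64, 3, 64, 64])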

Training a GAN

To train a 64x64 DCGAN, run:

python3 train_fixed_DCGAN.py -d

To train a 112x112 LSGAN, run:

python3 train_fixed_LSGAN.py -d

Passing the -d argument runs training with our default settings. If you would like to set your own hyperparameters, use the -h argument for help, or read and modify the source code directly.
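
For orientation, both scripts follow the usual adversarial training pattern: each iteration updates the discriminator on a real batch and a generated batch, then updates the generator to fool the discriminator. The function below is a minimal DCGAN-style sketch of one such iteration, not the repository's code; the network modules, optimizers, and the latent size of 100 are assumed inputs.

import torch
import torch.nn as nn

def dcgan_step(netG, netD, optimizerG, optimizerD, real_images, latent_dim=100):
    """One adversarial update: train D on real vs. fake, then train G to fool D.

    netG and netD stand in for the generator and discriminator modules (the
    actual architectures live under ./models/); latent_dim=100 is an assumption.
    """
    criterion = nn.BCELoss()
    batch_size = real_images.size(0)
    real_labels = torch.ones(batch_size)
    fake_labels = torch.zeros(batch_size)

    # Discriminator update: push D(real) toward 1 and D(G(z)) toward 0.
    optimizerD.zero_grad()
    loss_real = criterion(netD(real_images).view(-1), real_labels)
    z = torch.randn(batch_size, latent_dim, 1, 1)
    fake_images = netG(z)
    loss_fake = criterion(netD(fake_images.detach()).view(-1), fake_labels)
    loss_D = loss_real + loss_fake
    loss_D.backward()
    optimizerD.step()

    # Generator update: push D(G(z)) toward 1 so the fakes fool the discriminator.
    optimizerG.zero_grad()
    loss_G = criterion(netD(fake_images).view(-1), real_labels)
    loss_G.backward()
    optimizerG.step()

    return loss_D.item(), loss_G.item()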