Regularized and extrapolative season-to-season transfer for scene adaptation

Season/Scene-Neural-Transfer

To extend the concept of scene/style transfer to unpaired image-to-image translation, we implement a CycleGAN network in Python using the TensorFlow and Keras libraries for image data processing and neural network development.
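
For orientation, the two generators (Summer to Winter and Winter to Summer) are ordinary Keras models. A minimal sketch, much simpler than the ResNet-style generator typically used in CycleGAN and not the notebook's exact architecture, looks like this:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Minimal sketch of an encoder-decoder generator for 256x256 RGB images.
# The notebook's actual generator (ResNet blocks, instance normalization, etc.)
# is more involved; this only illustrates the Keras building style.
def build_generator(img_shape=(256, 256, 3)):
    inputs = keras.Input(shape=img_shape)
    x = layers.Conv2D(64, 4, strides=2, padding="same", activation="relu")(inputs)
    x = layers.Conv2D(128, 4, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(64, 4, strides=2, padding="same", activation="relu")(x)
    outputs = layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="tanh")(x)
    return keras.Model(inputs, outputs, name="generator")

g_AB = build_generator()  # Summer -> Winter
g_BA = build_generator()  # Winter -> Summer
```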

All of the code is written and run in Google Colaboratory.

This code is my attempt to further understand the concept of scene transfer. Earlier, working through Neural Style Transfer in detail, I learned a lot about style transfer and domain adaptation. Being fascinated by the intuition behind domain adaptation, I felt that working on season transfer was a great segue toward scene-adaptive object segmentation and depth estimation.

Inspired by the horse2zebra CycleGAN methodology (the analogy I used during the initial phases of understanding), I use the same network with hyperparameters tuned for season transfer:

[Figure: CycleGAN network used for season transfer]
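
The training objective combines an adversarial term (from the discriminators) with cycle-consistency and identity terms, as in the horse2zebra setup. Below is a condensed sketch of one generator update, with placeholder model and optimizer names rather than the notebook's own:

```python
import tensorflow as tf
from tensorflow import keras

mse = keras.losses.MeanSquaredError()   # least-squares adversarial loss
mae = keras.losses.MeanAbsoluteError()  # cycle-consistency / identity losses

# g_AB, g_BA: generators; d_A, d_B: discriminators; opt: a keras optimizer.
# These names are placeholders for whatever the notebook defines.
def generator_train_step(real_A, real_B, g_AB, g_BA, d_A, d_B, opt,
                         lambda_cycle=10.0, lambda_id=5.0):
    with tf.GradientTape() as tape:
        fake_B = g_AB(real_A, training=True)   # Summer -> Winter
        fake_A = g_BA(real_B, training=True)   # Winter -> Summer
        # Adversarial: the generators try to make the discriminators output "real" (1)
        adv = mse(tf.ones_like(d_B(fake_B)), d_B(fake_B)) + \
              mse(tf.ones_like(d_A(fake_A)), d_A(fake_A))
        # Cycle consistency: A -> B -> A' should reproduce A, and B -> A -> B' reproduce B
        cyc = mae(real_A, g_BA(fake_B, training=True)) + \
              mae(real_B, g_AB(fake_A, training=True))
        # Identity mapping: a Winter image fed to the Winter generator should barely change
        idt = mae(real_B, g_AB(real_B, training=True)) + \
              mae(real_A, g_BA(real_A, training=True))
        total = adv + lambda_cycle * cyc + lambda_id * idt
    variables = g_AB.trainable_variables + g_BA.trainable_variables
    opt.apply_gradients(zip(tape.gradient(total, variables), variables))
    return total
```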

The outputs obtained here are as follows:

From the training dataset, we go from Summer (real image) ----> Winter (generated image) ----> Summer (reconstructed image), i.e. A ----> B ----> A':

[Figure: training samples, Summer (real) ----> Winter (generated) ----> Summer (reconstructed)]
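
Concretely, each such triplet is produced by chaining the two generators; schematically (function and variable names are illustrative, not the notebook's exact code):

```python
import numpy as np

# Produce the (real, generated, reconstructed) triplet for one Summer image.
# g_AB and g_BA are the trained Summer->Winter and Winter->Summer generators.
def summer_to_winter_to_summer(real_summer, g_AB, g_BA):
    real = np.expand_dims(real_summer, axis=0)                # add batch dimension
    generated_winter = g_AB.predict(real)                     # A -> B
    reconstructed_summer = g_BA.predict(generated_winter)     # B -> A'
    return real[0], generated_winter[0], reconstructed_summer[0]
```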

Now, testing the generators on the testing dataset, we again go from Summer (real image) ----> Winter (generated image) ----> Summer (reconstructed image), i.e. A ----> B ----> A':

[Figure: test samples, Summer (real) ----> Winter (generated) ----> Summer (reconstructed)]
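
The test-set figures use the same chain, just on held-out images; if the generators were saved during training, they can be reloaded along these lines (file names are hypothetical):

```python
from tensorflow import keras

# Hypothetical file names; substitute whatever the notebook actually saves.
# If the generators contain custom layers (e.g. InstanceNormalization),
# pass them via custom_objects when loading.
g_AB = keras.models.load_model("g_model_AtoB.h5", compile=False)
g_BA = keras.models.load_model("g_model_BtoA.h5", compile=False)

# Then run a held-out Summer test image through A -> B -> A' as in the
# summer_to_winter_to_summer sketch above.
```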

The epochs, batch size, number of samples, and overall number of iterations set in the code are chosen with memory constraints in mind, i.e., for quick training and for understanding the neural networks.
To obtain the training results depicted above, I used approximately 7,200 iterations to get generated and reconstructed images of this quality; the output may vary based on conditions and user requirements. (With 12,000 iterations or more, the model gives exact scene transfer with the hyperparameters presently written in the code.)
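
For a rough sense of scale (the dataset size below is illustrative, not the actual count), iterations relate to epochs through the batch size:

```python
# Illustrative arithmetic only; substitute the real number of training images.
n_train_images = 1200          # hypothetical Summer training set size
batch_size = 1                 # CycleGANs are commonly trained with batch size 1
steps_per_epoch = n_train_images // batch_size   # 1200 iterations per epoch

print(7200 / steps_per_epoch)   # ~6 epochs for the 7,200-iteration result
print(12000 / steps_per_epoch)  # ~10 epochs for the 12,000-iteration result
```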

The reconstruction appears a bit wobbly/out of place on the training data compared to the test data, since I changed numerous parameters in the code to obtain the desired domain adaptation. Further training with a lower learning rate (smaller gradient-descent steps) could bring the reconstructed images closer to the real ones.
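
In Keras terms, further training with a lower learning rate would just mean re-instantiating the optimizer with a smaller step, e.g.:

```python
from tensorflow import keras

# 2e-4 with beta_1=0.5 is the usual CycleGAN default; an order of magnitude
# lower gives a gentler fine-tuning phase for the already-trained generators.
fine_tune_opt = keras.optimizers.Adam(learning_rate=2e-5, beta_1=0.5)
```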
