
Cycle gan g_loss and d_loss become nan after some epochs during training #32

Open
Ayesha-Rafiq opened this issue May 15, 2018 · 3 comments



Ayesha-Rafiq commented May 15, 2018

The generator loss and discriminator loss become NaN after some epochs, and CycleGAN starts to generate black images.

@Auth0rM0rgan

Check your optimizer section; it may be that something is set incorrectly in the optimizer.
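Besides a too-high learning rate, a frequent cause of NaN GAN losses is a saturated discriminator output hitting `log(0)` in the cross-entropy. A minimal pure-Python sketch (not this repo's code) of why clipping the prediction with a small epsilon keeps the loss finite:

```python
import math

def bce_real(pred, eps=0.0):
    """Binary cross-entropy against the 'real' label (1.0).

    `pred` is the discriminator's probability output; clipping it into
    [eps, 1 - eps] prevents log(0).
    """
    p = min(max(pred, eps), 1.0 - eps)
    return -math.log(p)

# Unclipped, a fully saturated discriminator output blows up:
# math.log(0.0) raises ValueError in pure Python; in TensorFlow it
# yields -inf, which then propagates to NaN through the gradients.
safe_loss = bce_real(0.0, eps=1e-7)  # finite, just a large value
```

In TF1-style code the analogous guard is clipping the sigmoid output (or using the numerically stable logits-based cross-entropy ops) before taking the log.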


hala3 commented Jun 13, 2019

I want to change the loss function to a pixel-wise loss. How can I do that, please?
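One common choice of pixel-wise loss is the mean absolute (L1) difference between corresponding pixels; in TF1 style this is typically written as `tf.reduce_mean(tf.abs(real - fake))`. A minimal pure-Python sketch of the computation (the function name is illustrative, not from the repo):

```python
def pixelwise_l1(real, fake):
    """Mean absolute difference over flattened pixel values."""
    assert len(real) == len(fake)
    return sum(abs(r - f) for r, f in zip(real, fake)) / len(real)

# Example on three pixel intensities:
loss = pixelwise_l1([0.0, 0.5, 1.0], [0.0, 0.25, 0.5])  # -> 0.25
```

You would then add this term to (or substitute it for) the existing generator loss, scaled by a weighting hyperparameter.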

@nagaswethar

```python
self.g_loss_a2b = self.criterionGAN(self.DB_fake, tf.ones_like(self.DB_fake)) \
    + self.L1_lambda * abs_criterion(self.real_A, self.fake_A_) \
    + self.L1_lambda * abs_criterion(self.real_B, self.fake_B_) \
    + self.Lg_lambda * gradloss_criterion(self.real_A, self.fake_B, self.weighted_seg_A) \
    + self.Lg_lambda * gradloss_criterion(self.real_B, self.fake_A, self.weighted_seg_B)
self.g_loss_b2a = self.criterionGAN(self.DA_fake, tf.ones_like(self.DA_fake)) \
    + self.L1_lambda * abs_criterion(self.real_A, self.fake_A_) \
    + self.L1_lambda * abs_criterion(self.real_B, self.fake_B_) \
    + self.Lg_lambda * gradloss_criterion(self.real_A, self.fake_B, self.weighted_seg_A) \
    + self.Lg_lambda * gradloss_criterion(self.real_B, self.fake_A, self.weighted_seg_B)
```

Can anyone explain this loss function, and what `\` indicates in the code?
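On the `\` part of the question: in Python, a backslash at the end of a line continues the statement onto the next line, so each loss above is a single long sum of the adversarial term plus the weighted L1 and gradient terms. A tiny sketch:

```python
# '\' at end of line continues the expression onto the next line
total = 1 \
    + 2 \
    + 3

# Equivalently, wrapping the expression in parentheses avoids backslashes
total_paren = (1
               + 2
               + 3)

# Both evaluate to 6
```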


4 participants