
Loss becomes NaN in pretrainMain when using my own images #15

Open
Darwin84 opened this issue Feb 9, 2017 · 0 comments


Darwin84 commented Feb 9, 2017

Hi,

When I use my own images to pretrain the encoders, I find that in the pretrainMain function the loss becomes NaN after the first 1000 iterations of the first-layer training.

For the first-layer pretraining, the network consists of all the encoder layers plus the last layer of the decoder, so the loss is the Euclidean distance between the input data and the output of the decoder.
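For context, here is a minimal sketch of the loss as I understand it (plain NumPy, not the repo's actual code; the function name and the batch layout are my assumptions):

```python
import numpy as np

def reconstruction_loss(x, x_hat):
    """Mean Euclidean (L2) distance between each input vector and its
    reconstruction, as described above."""
    # x, x_hat: (batch_size, dim) arrays
    return np.mean(np.sqrt(np.sum((x - x_hat) ** 2, axis=1)))

# Toy usage: raw 8-bit pixel values give a much larger loss (and larger
# gradients) than inputs scaled to [0, 1], which is one common way a
# loss like this can diverge to NaN during training.
x = np.random.randint(0, 256, size=(4, 784)).astype(np.float64)
print(reconstruction_loss(x, x / 2))              # large, unnormalized
print(reconstruction_loss(x / 255.0, x / 510.0))  # small, normalized
```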

Is this loss ok?
