FixedBatchNormalization issue #9

Open
rohith513 opened this issue Jul 19, 2019 · 0 comments
@rohith513

Hello, thank you for adding the validation part. Can you tell me why you used FixedBatchNormalization instead of BatchNormalization in ResNet?
I want to freeze the earlier layers in ResNet and train the rest of the model. I am not sure how this custom batch norm layer would behave in that case, since I am freezing the batch norm layers as well.
During training, the validation loss is high and the classifier accuracy decreases. I suspect this has something to do with batch norm.
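
For reference, here is a minimal sketch of how a FixedBatchNormalization layer is commonly implemented in Keras (an assumption about the usual pattern, not necessarily this repository's exact code): it always normalizes with the stored moving statistics and keeps all of its parameters non-trainable, which is typically done because detection models are trained with batch sizes too small to estimate reliable batch statistics.

```python
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Layer


class FixedBatchNormalization(Layer):
    """Sketch of a batch norm layer with frozen parameters and statistics."""

    def __init__(self, axis=-1, epsilon=1e-3, **kwargs):
        super().__init__(**kwargs)
        self.axis = axis
        self.epsilon = epsilon

    def build(self, input_shape):
        dim = int(input_shape[self.axis])
        # Every weight is created as non-trainable, so the layer acts as a
        # fixed affine transform based on the loaded pretrained statistics.
        self.gamma = self.add_weight(name='gamma', shape=(dim,),
                                     initializer='ones', trainable=False)
        self.beta = self.add_weight(name='beta', shape=(dim,),
                                    initializer='zeros', trainable=False)
        self.moving_mean = self.add_weight(name='moving_mean', shape=(dim,),
                                           initializer='zeros', trainable=False)
        self.moving_variance = self.add_weight(name='moving_variance', shape=(dim,),
                                               initializer='ones', trainable=False)
        super().build(input_shape)

    def call(self, inputs, training=None):
        # The `training` flag is ignored: batch statistics are never computed
        # or updated; only the stored moving statistics are applied.
        return K.batch_normalization(inputs,
                                     self.moving_mean,
                                     self.moving_variance,
                                     self.beta,
                                     self.gamma,
                                     axis=self.axis,
                                     epsilon=self.epsilon)
```

Under that assumption, freezing these layers would not change their behaviour, since they are already fixed at the pretrained statistics in both training and inference.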
