
Added Clipping constraint and KLD loss with CD #180

Merged (11 commits) — Nov 11, 2021

Conversation

@Aakanksha-Rana (Member) commented Oct 14, 2021

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Summary

This PR improves the concrete dropout code by adding the KL loss approximation, and also adds a clipping constraint on the dropout probability p so that it does not collapse to 0.
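The PR description does not include the code itself, but the two pieces it names (a clipping constraint on p and a KL-divergence regularization term) can be sketched roughly as below. The helper names, default bounds, and scale factors here are hypothetical, loosely following the concrete-dropout regularizer of Gal et al. (2017); the actual implementation in this PR may differ.

```python
import math

def clip_p(p, eps=1e-3):
    """Keep the dropout probability away from 0 and 1 so log(p) stays finite.

    Hypothetical helper: the PR's actual clipping bounds may differ.
    """
    return min(max(p, eps), 1.0 - eps)

def kld_regularizer(p, num_units, dropout_reg=1e-5):
    """Approximate KL contribution of the dropout probability.

    Follows the concrete-dropout formulation: a scaled negative Bernoulli
    entropy term, p*log(p) + (1-p)*log(1-p), summed over the layer's units.
    The scale factor dropout_reg is illustrative.
    """
    p = clip_p(p)
    neg_entropy = p * math.log(p) + (1.0 - p) * math.log(1.0 - p)
    return dropout_reg * num_units * neg_entropy
```

Minimizing this term (which is negative, and most negative near p = 0 or 1) pushes p toward higher entropy, counterbalancing the likelihood term that would otherwise drive p to 0 — which is exactly the failure mode the clipping constraint guards against.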

Checklist

  • I have added tests to cover my changes
  • I have updated documentation (if necessary)

Acknowledgment

  • I acknowledge that this contribution will be available under the Apache 2 license.

@Aakanksha-Rana (Member, Author)

@satra this looks unusual to me. Is this a problem with the hook? The error doesn't seem to come from the code.

@satra (Contributor) commented Oct 22, 2021

it looks like black is modifying some file: https://results.pre-commit.ci/run/github/94588639/1634850114.qOu0gxTiQIyeYIEiVOoi6g — you can run pre-commit locally on the exact branch after fetching it from GitHub to see what is changing.
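The local check satra suggests could look roughly like this (the local branch name `pr-180` is illustrative; `pre-commit` must be installed, and `origin` is assumed to point at the neuronets repo — the `pull/180/head` refspec is GitHub's standard ref for fetching a PR branch):

```shell
# Fetch the PR branch from GitHub into a local branch
git fetch origin pull/180/head:pr-180
git checkout pr-180

# Run the same hooks pre-commit.ci runs; black will report/rewrite
# any files whose formatting it would change
pre-commit run --all-files
```

If black modifies files, committing and pushing the result should make the pre-commit.ci check pass.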

@satra satra merged commit 841a1b9 into neuronets:master Nov 11, 2021