More robust testing for AlphaDropout #1851
I want to work on this issue, could you please provide me with a bit more info?
The best place to start would be the paper itself, https://arxiv.org/abs/1706.02515. The section "New Dropout Technique" talks about Alpha dropout and some of its unique properties. Testing for those would be a great way to ensure we're following the spec.
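For anyone picking this up, here is my reading of that section: dropped activations are set to the constant α′ = −λα ≈ −1.758 (the SELU constants), and an affine correction a·x + b is then applied so that unit-Gaussian inputs keep mean 0 and variance 1. A standalone sketch of those equations, not the library code, just the paper as I understand it:

```julia
using Statistics

# Alpha dropout as described in "Self-Normalizing Neural Networks"
# (section "New Dropout Technique"). p is the drop probability, q = 1 - p.
function alpha_dropout_ref(x, p; λ = 1.0507009873554805, α = 1.6732632423543772)
    q  = 1 - p
    α′ = -λ * α                               # dropped units are set to this value (≈ -1.7581)
    a  = inv(sqrt(q + α′^2 * q * (1 - q)))    # affine correction restoring unit variance ...
    b  = -a * (1 - q) * α′                    # ... and zero mean
    keep = rand(size(x)...) .< q              # keep each unit with probability q
    return a .* (x .* keep .+ α′ .* .!keep) .+ b
end

x = randn(10^6)
y = alpha_dropout_ref(x, 0.2)
println((mean(y), var(y)))                    # should be ≈ (0, 1)
```

Those two invariants, mean/variance preservation and the fixed value for dropped units, are the properties the tests should pin down.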
Well, that's a teeny-tiny paper, lemme read the required section and I'll get back to the tests :)
I would also love to work on this issue and have read the required parts. Could I get some more references for this?
Cool, I'll start working on it as well once I'm done with the contrastive loss function.
We currently borrow from https://github.com/pytorch/pytorch/blob/v1.10.0/test/cpp/api/modules.cpp#L1337-L1338, but some sort of goodness-of-fit test would be more robust than a simple range check. Sampling the layer outputs should have a negligible impact on test runtime.
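A rough sketch of what that could look like, assuming Flux's `AlphaDropout` layer and `trainmode!`; the tolerances below are placeholders tied to the sample size, and a formal chi-square or KS test (e.g. via HypothesisTests.jl) could be substituted for the hand-rolled bounds:

```julia
using Flux, Statistics, Test, Random

Random.seed!(0)
p = 0.2
m = AlphaDropout(p)
Flux.trainmode!(m)                    # make sure dropout is active outside a training loop

x = randn(Float32, 1000, 1000)        # 1e6 unit-Gaussian inputs
y = m(x)
n = length(y)

# Self-normalising property: unit-Gaussian input should keep mean ≈ 0 and
# variance ≈ 1, with tolerances tied to the sample size rather than a fixed range.
@test abs(mean(y)) < 5 / sqrt(n)      # ~5 standard errors of the mean
@test abs(var(y) - 1) < 0.02

# Drop rate: every dropped entry maps to the same constant, so the most frequent
# output value should appear with relative frequency ≈ p (within binomial error).
counts = Dict{eltype(y), Int}()
for v in y
    counts[v] = get(counts, v, 0) + 1
end
drop_frac = maximum(values(counts)) / n
@test abs(drop_frac - p) < 5 * sqrt(p * (1 - p) / n)
```

The drop-rate check relies on all dropped entries mapping to one constant, so the most frequent output value stands in for the mask; the exact tolerances and sample sizes are of course up for discussion.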