More robust testing for AlphaDropout #1851

Open
ToucheSir opened this issue Jan 27, 2022 · 6 comments

@ToucheSir
Member

We currently borrow from https://github.com/pytorch/pytorch/blob/v1.10.0/test/cpp/api/modules.cpp#L1337-L1338, but some sort of goodness-of-fit test would be more robust than a simple range check. Sampling the layer outputs should have a negligible impact on test runtime.
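
For a rough idea of what I mean (sample size and tolerances here are just placeholder choices, and it assumes `Flux.trainmode!` to force dropout on), something along these lines could replace the fixed range:

```julia
using Flux, Statistics

# Instead of asserting the output lies in a hard-coded range, sample many outputs
# and require the empirical statistics to be close to the values alpha dropout is
# supposed to preserve (mean 0 and variance 1 for standard-normal inputs).
n = 100_000
x = randn(Float32, n)

m = AlphaDropout(0.5)
Flux.trainmode!(m)   # make sure dropout is actually applied outside of training
y = m(x)

# Tolerance on the mean scaled to the standard error for this sample size;
# the variance estimate is noisier, so use a looser hand-picked tolerance.
@assert abs(mean(y)) < 4 / sqrt(n)
@assert abs(var(y) - 1) < 0.05
```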

@aritropc

aritropc commented Feb 7, 2022

I want to work on this issue; could you please provide me with a bit more info?

@ToucheSir
Member Author

The best place to start would be the paper itself, https://arxiv.org/abs/1706.02515. The section "New Dropout Technique" talks about Alpha dropout and some of its unique properties. Testing for those would be a great way to ensure we're following the spec.
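
Roughly, as I read that section: with dropout probability p (keep probability q = 1 - p), dropped activations are set to the SELU saturation value α′ = -λα, and an affine correction is applied so that zero mean and unit variance are preserved for standard-normal inputs (assuming I'm transcribing the constants correctly):

```latex
\alpha' = -\lambda\alpha \approx -1.7581, \qquad
\tilde{x} = a\,\bigl(x d + \alpha'(1 - d)\bigr) + b, \quad d \sim \mathrm{Bernoulli}(q),
\qquad
a = \bigl(q + \alpha'^2 q (1 - q)\bigr)^{-1/2}, \qquad b = -a\,(1 - q)\,\alpha'
```

Those preserved statistics are the unique properties worth testing for.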

@aritropc

aritropc commented Feb 7, 2022

Well, that's a teeny-tiny paper; lemme read the required section and I'll get back to the tests :)

@arcAman07
Contributor

arcAman07 commented Feb 19, 2022

I would also love to work on this issue and have read the required parts. Could I get some more references for this?

@darsnack
Member

I have highlighted the appropriate sections in the text below. We want to sample many x values and test whether the mean and variance after dropout match the paper.
[Screenshot: highlighted excerpt of the "New Dropout Technique" section from the paper]
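
Something along these lines could work as a sketch (sample size, the choice of dropout probabilities, and tolerances are just placeholders, and it assumes `Flux.trainmode!` to force dropout on):

```julia
using Flux, Statistics, Test, Random

@testset "AlphaDropout preserves mean and variance" begin
    Random.seed!(0)
    x = randn(Float32, 100_000)        # many standard-normal samples
    for p in (0.2f0, 0.5f0, 0.8f0)     # a few dropout probabilities
        m = AlphaDropout(p)
        Flux.trainmode!(m)             # apply dropout outside of a training loop
        y = m(x)
        # Per the paper, the affine correction keeps mean ≈ 0 and variance ≈ 1.
        @test mean(y) ≈ 0 atol=0.05
        @test var(y) ≈ 1 atol=0.05
    end
end
```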

@arcAman07
Contributor

Cool, I will start working on it as well, as soon as I'm done with the contrastive loss function.
