Minimal Variance Sampling in Stochastic Gradient Boosting #2644
Comments
Closed in favor of #2302, where we decided to keep all feature requests in one place. Contributions of this feature are welcome! Please re-open this issue (or post a comment if you are not the topic starter) if you are actively working on implementing it.
@guolinke Exciting! What do you think about borrowing that code with attribution to the author? I guess if the author had wished to contribute to the upstream repo, there was enough time to do it from the latest commits...
Uhhh, just noticed that this is one of the authors of MVS!
Closing according to #5091 (comment). Contributions of this feature are welcome! Feel free to fork the https://github.com/microsoft/LightGBM/tree/mvs_dev branch.
Summary
MVS can be considered an improved version of Gradient-based One-Side Sampling (GOSS; see details in the paper) implemented in LightGBM, which keeps a given number of top examples ranked by |gi| with probability 1 and samples the remaining examples with the same fixed probability. Thanks to its theoretical basis, MVS provides a lower-variance estimate of E[g] than GOSS.
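To make the contrast with GOSS concrete, here is a minimal NumPy sketch of the MVS sampling idea: score each example by its regularized absolute gradient, find a threshold so that the expected sample size matches the requested rate, keep each example with probability min(1, score/threshold), and reweight the kept examples by the inverse probability so the gradient estimate stays unbiased. This is an illustration of the scheme described in the paper, not LightGBM's implementation; the function name `mvs_sample` and the regularization parameter `lam` are illustrative choices.

```python
import numpy as np

def mvs_sample(grad, hess, sample_rate, lam=0.1, rng=None):
    """Sketch of Minimal Variance Sampling (MVS) for one boosting iteration.

    grad, hess  -- per-example gradients and Hessians
    sample_rate -- desired fraction of examples to keep
    Returns indices of kept examples and their importance weights.
    """
    rng = rng or np.random.default_rng()
    # Regularized absolute gradient used as the MVS sampling score.
    score = np.sqrt(grad ** 2 + lam * hess ** 2)
    n = len(score)
    k = sample_rate * n
    # Bisect for a threshold mu such that sum(min(1, score/mu)) ~ k.
    # min(1, score/mu) is the keep probability of each example.
    lo, hi = 0.0, score.max() * n / max(k, 1e-12) + 1e-12
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        if np.minimum(1.0, score / mu).sum() > k:
            lo = mu  # expected sample too large -> raise the threshold
        else:
            hi = mu
    p = np.minimum(1.0, score / mu)
    keep = rng.random(n) < p
    # Inverse-probability weights keep the sampled gradient sum unbiased.
    weights = 1.0 / p[keep]
    return np.flatnonzero(keep), weights
```

Examples with large regularized gradients are kept deterministically (weight 1), while low-gradient examples are subsampled and upweighted, which is what drives the variance reduction relative to GOSS's fixed two-bucket scheme.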
References
Docs:
Code:
NeurIPS 2019 Poster:
NeurIPS 2019 Paper: