Probabilistic Forecasting #3200
Seconded - I think this would make LightGBM incredibly more useful for the same regression problems it is used to tackle now, as well as additional problems that require a more probabilistic approach. Some of the ngboost team's ideas for next steps, on predicting joint probability distributions, as they mention in their slides (https://drive.google.com/file/d/183BWFAdFms81MKy6hSku8qI97OwS_JH_/view), are particularly interesting as well: Demonstrate use for joint-outcomes regression (e.g. “what's the probability that it rains >3 inches and is >15C tomorrow?”)
Thanks @MotoRZR for referring to my repo https://github.com/StatMixedML. In fact, I am currently also working on an extension of LightGBM to probabilistic forecasting; see the repo here: https://github.com/StatMixedML/LightGBMLSS
This would be a wonderful addition. FWIW - Catboost has recently rolled out support for something like this as well, in version 0.24.
Thanks to GitHub we can find the corresponding commit: catboost/catboost@af88523.
@kmedved Thanks for pointing towards the CatBoost feature. The fact that the RMSE is used as a loss function makes me doubt that it is truly a probabilistic approach. The reason is that the splitting procedures used internally to construct trees can detect changes in the mean only, so standard implementations of machine learning models are not able to recognize any distributional changes (e.g., a change of variance).
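A minimal sketch of this point, assuming a plain L2 objective (the synthetic data and names here are illustrative, not from the thread): when only the variance changes with a covariate, a mean-fitting model produces essentially the same point prediction everywhere.

```python
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
n = 10_000
x = rng.uniform(0, 1, size=(n, 1))
# Constant mean (10), but the noise scale grows with x: a purely
# distributional change that a mean-only split criterion cannot detect.
y = 10 + rng.normal(scale=0.1 + 2.0 * x[:, 0])

model = lgb.LGBMRegressor(objective="regression")  # plain L2 loss
model.fit(x, y)

# Both predictions are close to 10; the changing spread is invisible.
print(model.predict([[0.05]]), model.predict([[0.95]]))
```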
That's a good note @StatMixedML, although I am actually not totally sure they're using plain RMSE as a loss function (despite the name). An explainer notebook is coming, but if you try it out, the validation loss on the model does not match (or even resemble) RMSE. I've put together an example Colab notebook here, using the CA housing dataset without any tuning.
@kmedved Thanks for the interesting comparison! Indeed, it seems to be more than a plain RMSE loss. Anyway, I am not sure how one would evaluate it: the first half, of course, relates to a probabilistic forecast, whereas the second half aims at point forecasts.
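One possible way to score both halves, assuming the model outputs a Gaussian mean mu and standard deviation sigma per observation, is to pair a proper scoring rule (negative log-likelihood or CRPS) for the distributional part with RMSE for the point part. A minimal sketch:

```python
import numpy as np
from scipy.stats import norm

def gaussian_nll(y, mu, sigma):
    # Average negative log-likelihood of y under N(mu, sigma^2).
    return -np.mean(norm.logpdf(y, loc=mu, scale=sigma))

def gaussian_crps(y, mu, sigma):
    # Closed-form CRPS for a Gaussian predictive distribution.
    z = (y - mu) / sigma
    return np.mean(sigma * (z * (2 * norm.cdf(z) - 1)
                            + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi)))

def rmse(y, mu):
    # Point-forecast accuracy of the predicted mean.
    return np.sqrt(np.mean((y - mu) ** 2))
```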
Keep in mind that LightGBM already includes Quantile Regression. Even though it may not enjoy the probabilistic properties of a true probabilistic (let alone Bayesian) forecast, it is still the most used method nowadays for variance forecast estimation, at least for aleatoric uncertainty.
Quantile regression is fine if you're only interested in specific quantiles. If you want the full distribution, it's not as useful. Also, with quantile regression it's inefficient to have a separate model for each quantile, each with its own set of hyperparameters. With neural nets there are various ways to do this; with GBDT there are not as many tools, and only recently has NGBoost come on the scene. Seems like StatMixedML also has something in the works. Quantile regression is ok, but not a magic-bullet solution for the reasons mentioned.
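A minimal sketch of the one-model-per-quantile workflow with LightGBM's built-in quantile objective (the toy data is only a stand-in for a real training set):

```python
import numpy as np
import lightgbm as lgb

# Toy data as a stand-in for a real training set.
rng = np.random.default_rng(1)
X_train = rng.uniform(0, 1, size=(5_000, 3))
y_train = X_train @ np.array([1.0, 2.0, 3.0]) + rng.normal(scale=0.5, size=5_000)
X_test = rng.uniform(0, 1, size=(100, 3))

quantiles = [0.05, 0.25, 0.5, 0.75, 0.95]
# One model per quantile, each of which could in principle need its own tuning.
models = {
    q: lgb.LGBMRegressor(objective="quantile", alpha=q).fit(X_train, y_train)
    for q in quantiles
}
preds = {q: m.predict(X_test) for q, m in models.items()}
```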
This article also points out some of the flaws of quantile regression. Not specifically LightGBM related but still relevant.
@MotoRZR yes, I recall reading that article a while ago. IMO, the most interesting approach is the one where the parameters of a distribution are estimated (that is, the first one you mentioned two messages above). In fact, that distribution parameter estimation method is what Amazon uses in their DeepAR paper (which happens to be the default model in their AWS Forecaster service). However, I am not sure whether this should be added as a new objective. It is relatively easy to get it up and running with the existing API. MC dropout for boosted trees is something I have been thinking about. However, at least in neural nets, MC dropout generates distributions that usually end up having low variance (and therefore narrow prediction intervals) compared to more conventional Bayesian inference fitting methods. But perhaps it is worth exploring for GBDT...
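A minimal sketch of one such route with the existing API (an assumption about how this could be done, not necessarily what is meant above): fit one model for the conditional mean and a second for the conditional log-variance via the squared residuals, which yields a Gaussian predictive distribution N(mu(x), sigma^2(x)).

```python
import numpy as np
import lightgbm as lgb

# Toy heteroscedastic data: mean and variance both depend on x.
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(10_000, 1))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1 + X[:, 0])
X_train, X_test, y_train = X[:8_000], X[8_000:], y[:8_000]

# Stage 1: conditional mean.
mean_model = lgb.LGBMRegressor().fit(X_train, y_train)
residuals = y_train - mean_model.predict(X_train)

# Stage 2: conditional log-variance via the squared residuals
# (the log keeps the implied variance positive).
var_model = lgb.LGBMRegressor().fit(X_train, np.log(residuals**2 + 1e-12))

mu = mean_model.predict(X_test)
sigma = np.sqrt(np.exp(var_model.predict(X_test)))
```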
Closed in favor of #2302; we decided to keep all feature requests in one place. You are welcome to contribute this feature! Please re-open this issue (or post a comment if you are not the topic starter) if you are actively working on implementing it.
I have just released LightGBMLSS, which is an extension of LightGBM to probabilistic forecasting. It has very similar functionality to XGBoostLSS, which is also under active development. I hope this is helpful for bringing LightGBM to a probabilistic setting. Looking forward to your feedback!
Can this be integrated into the main lightgbm package? Looks rather interesting.
@onacrame Thanks for your interest in LightGBMLSS. In principle, it is possible to create a pull request and integrate it into lightgbm itself. As of now, I am not planning to do this since there are still some additions I want to bring to LightGBMLSS.
In our latest paper, we extend LightGBM to a probabilistic setting using Normalizing Flows. Hence, instead of assuming a parametric distribution, we approximate the conditional cumulative distribution function via a set of transformations, i.e., Normalizing Flows. You can find the paper on the repo https://github.com/StatMixedML/DGBM. Yet, we are still struggling with the runtime. I have created an issue on the repo that also links to a more thorough description of where the computational bottleneck might be. Appreciate any support on this.
This issue has been automatically locked since there has not been any recent activity after it was closed.
This was locked in error, sorry.
Hi, do we know if there is any possibility of including LightGBMLSS- or NGBoost-like functionality within LightGBM? This would be a very powerful addition to what we currently have in LightGBM.
We know that LightGBM currently supports quantile regression, which is great. However, quantile regression can be an inefficient way to gauge prediction uncertainty because a new model needs to be built for every quantile, and in theory each of those models may have its own set of optimal hyperparameters. This becomes unwieldy from a production standpoint if you're interested in multiple quantiles, as you can end up with many models. One of the main limitations of standard machine learning models is that they only produce point predictions, whereas businesses are often interested in the full probability distribution of a given prediction. There are various methods to do this with neural networks, and only recently have there been new ways to address this with tree-based models.
The NGBoost library has attempted to do this as per below
https://stanfordmlgroup.github.io/projects/ngboost/
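A short sketch of the kind of interface NGBoost exposes, following its public examples (parameter names may differ across versions; the toy data is only a stand-in):

```python
import numpy as np
from ngboost import NGBRegressor
from ngboost.distns import Normal

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(2_000, 2))
y = X @ np.array([1.0, -2.0]) + rng.normal(scale=0.3, size=2_000)

ngb = NGBRegressor(Dist=Normal).fit(X, y)

point_preds = ngb.predict(X[:5])   # usual point forecast (the mean)
pred_dist = ngb.pred_dist(X[:5])   # full predictive distribution per observation
print(point_preds, pred_dist.params["loc"], pred_dist.params["scale"])
```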
Additionally, there has been a paper on adapting XGBoost to do this (in a different manner), although the author has not yet posted an implementation.
https://github.com/StatMixedML/XGBoostLSS
Something to consider as a feature as it would make LightGBM infinitely more valuable in regression scenarios.