Description
PerpetualBooster is a new gradient boosting machine (GBM) algorithm that, unlike other GBM algorithms, doesn't need hyperparameter optimization. Instead it exposes a single `budget` parameter that can be tweaked to control the search. According to the authors' benchmarks, it is faster than a tuned LightGBM for both training and inference with similar performance, and beats AutoGluon on most of the 10 datasets tested.
https://github.com/perpetual-ml/perpetual
Use case
I'd be keen to test this as a drop-in replacement for any other GBM model and integrate it natively into Nixtla's MLforecast.
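As a rough illustration of why it could be a drop-in replacement: MLforecast accepts any scikit-learn-style regressor exposing `fit`/`predict`, and PerpetualBooster advertises that same interface. The sketch below uses sklearn's `GradientBoostingRegressor` purely as a stand-in so it runs without extra dependencies; the `PerpetualBooster()` name and its `budget` keyword mentioned in the comments are assumptions taken from the project's README, not verified API.

```python
# Hedged sketch: any regressor with a fit/predict contract can slot into
# MLforecast's `models` list. We use sklearn's GradientBoostingRegressor
# as a placeholder; swapping in PerpetualBooster would (per its README,
# unverified) look like:  model = PerpetualBooster(); model.fit(X, y, budget=1.0)
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=200)

model = GradientBoostingRegressor(random_state=0)  # stand-in for PerpetualBooster()
model.fit(X, y)           # PerpetualBooster would take its `budget` kwarg here (assumed)
preds = model.predict(X)  # identical predict contract

print(preds.shape)
```

Because both models share this contract, the integration work would mostly be packaging (optional dependency, default `budget`) rather than new training code.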