Releases: google/yggdrasil-decision-forests
1.0.0
Yggdrasil Decision Forests 1.0.0
With this release, Yggdrasil Decision Forests reaches its first major version, 1.0.0 🥳
With this milestone, we want to communicate more broadly that Yggdrasil Decision Forests has become a stable and mature library. In particular, we established more comprehensive testing to make sure that YDF is ready for professional environments.
Features
- Go (GoLang) inference API (Beta): simple engine written in Go to run inference on YDF and TF-DF models.
- Creation of HTML evaluation reports with plots (e.g., ROC, PR-ROC).
- Add support for Random Forest, CART, regressive GBT, and Ranking GBT models in the Go API.
- Add customization of the number of IO threads in the deployment proto.
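The new evaluation report plots ROC and PR curves. As background for readers less familiar with these plots, here is a minimal, self-contained sketch of how ROC points are computed from labels and scores. This is standard metric code for illustration only, not YDF's implementation:

```python
def roc_points(labels, scores):
    """Return (false_positive_rate, true_positive_rate) pairs, one per
    decision threshold, from the most permissive to the strictest."""
    # Sort examples by decreasing score: each prefix of this order is a
    # candidate "predict positive" set.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for i in order:
        if labels[i]:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

# Perfectly separated scores trace the ideal ROC through (0, 1).
print(roc_points([1, 1, 0, 0], [0.9, 0.8, 0.3, 0.1]))
# → [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0), (0.5, 1.0), (1.0, 1.0)]
```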
1.0.0rc0
0.2.5
Features
- Multi-threading of the oblique splitter for gradient boosted tree models.
- Support for Javascript + WebAssembly inference of models.
- Support for pure serving models, i.e., models containing only the serving data.
- Add "edit_model" cli tool.
Fix
- Remove bias toward low outcome in uplift modeling.
Javascript + WebAssembly RC1 for YDF v0.2.5
Pre-compiled binary for the Javascript + WebAssembly inference library.
Compiled with:
```shell
bazel build -c opt --config=lto --config=size --config=wasm //yggdrasil_decision_forests/port/javascript:create_release
```
0.2.4
Features
- Discard hessian splits with a score lower than the parent's. This change has little effect on model quality, but it can reduce model size.
- Add internal flag `hessian_split_score_subtract_parent` to subtract the parent score in the computation of a hessian split score.
- Add the hyper-parameter optimizer as one of the meta-learners.
- The Random Forest and CART learners support the `NUMERICAL_UPLIFT` task.
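To illustrate what "subtracting the parent score" means for a hessian split, the sketch below uses the conventional second-order split score from gradient boosting, G²/(H + λ) per node, as popularized by XGBoost. The formula and the λ regularizer are assumptions for illustration; YDF's exact internal scoring may differ:

```python
def node_score(grad_sum, hess_sum, lam=1.0):
    # Conventional second-order leaf score: G^2 / (H + lambda).
    return grad_sum**2 / (hess_sum + lam)

def split_score(left, right, parent, subtract_parent=True):
    """left/right/parent are (grad_sum, hess_sum) pairs.
    With subtract_parent, the score becomes a gain relative to not
    splitting, so splits that do not improve on the parent come out <= 0
    and can be discarded."""
    score = node_score(*left) + node_score(*right)
    if subtract_parent:
        score -= node_score(*parent)
    return score

# The children's gradient/hessian sums must add up to the parent's.
left, right = (3.0, 4.0), (-1.0, 2.0)
parent = (2.0, 6.0)
print(split_score(left, right, parent))
```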
0.2.3
Features
- Honest Random Forests (also work with Gradient Boosted Tree and CART).
- Can train Random Forests with example sampling without replacement.
- Add support for Focal Loss in Gradient Boosted Tree learner.
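Focal loss down-weights well-classified examples so that training focuses on the hard ones. As background, here is a minimal sketch of the standard binary formulation, FL(p_t) = -α_t (1 - p_t)^γ log(p_t); the α and γ defaults below are the common illustrative values, not necessarily YDF's defaults:

```python
import math

def focal_loss(label, p, alpha=0.25, gamma=2.0):
    """Binary focal loss for one example.
    label: 1 or 0; p: predicted probability of the positive class."""
    p_t = p if label == 1 else 1.0 - p        # probability of the true class
    a_t = alpha if label == 1 else 1.0 - alpha
    # The (1 - p_t)^gamma factor shrinks the loss of easy examples
    # (p_t close to 1) much more than that of hard ones.
    return -a_t * (1.0 - p_t) ** gamma * math.log(p_t)

# An easy example (p_t = 0.9) contributes far less than a hard one (p_t = 0.1).
print(focal_loss(1, 0.9), focal_loss(1, 0.1))
```

With γ = 0 and α = 1, the expression reduces to plain cross-entropy, which is a quick sanity check on the implementation.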
Fixes
- Incorrect default evaluation of categorical splits with uplift tasks. This was making uplift models with missing categorical values perform worse, and could make inference of uplift models slower.
0.2.2
Features
- The CART learner exports the number of pruned nodes in the output model meta-data. Note: the CART learner outputs a Random Forest model with a single tree.
- The Random Forest and CART learners support the `CATEGORICAL_UPLIFT` task.
- Add `SetLoggingLevel` to control the amount of logging.
Fixes
- Fix tree pruning in the CART learner for regression tasks.