
Releases: google/yggdrasil-decision-forests

1.0.0

07 Sep 16:18

Yggdrasil Decision Forests 1.0.0

With this release, Yggdrasil Decision Forests finally reaches its first major release 1.0.0 🥳

With this milestone, we want to communicate more broadly that Yggdrasil Decision Forests has become a stable and mature library. In particular, we established more comprehensive testing to make sure that YDF is ready for professional environments.

Features

  • Go (Golang) inference API (Beta): a simple engine written in Go for
    running inference on YDF and TF-DF models.
  • Creation of HTML evaluation reports with plots (e.g., ROC, PR-ROC).
  • Add support for Random Forest, CART, regression GBT, and ranking GBT
    models in the Go API.
  • Add customization of the number of IO threads in the deployment proto.

1.0.0rc0

26 Aug 13:47
Pre-release

Fixes

  • Improved documentation.

0.2.5

17 Jul 11:22

Features

  • Multi-threading of the oblique splitter for gradient boosted tree models.
  • Support for Javascript + WebAssembly inference of models.
  • Support for pure serving models, i.e., models containing only the data
    required for serving.
  • Add the "edit_model" CLI tool.

Fixes

  • Remove a bias toward low outcomes in uplift modeling.

Javascript + WebAssembly RC1 for YDF v0.2.5

11 Jul 17:44

Pre-compiled binary for the Javascript + WebAssembly inference library.

Compiled with:

bazel build -c opt --config=lto --config=size --config=wasm //yggdrasil_decision_forests/port/javascript:create_release

0.2.4

19 May 19:37

Features

  • Discard hessian splits with a score lower than the parent's. This change
    has little effect on model quality, but it can reduce model size.
  • Add the internal flag hessian_split_score_subtract_parent to subtract the
    parent score in the computation of a hessian split score.
  • Add the hyper-parameter optimizer as one of the meta-learners.
  • The Random Forest and CART learners support the NUMERICAL_UPLIFT task.
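The two hessian-split changes above can be sketched as follows. This is an illustrative simplification, not YDF's actual splitter code, and the function name effective_split_score is hypothetical:

```python
def effective_split_score(split_score, parent_score, subtract_parent=False):
    """Score used to rank a candidate hessian split, or None to discard it.

    Illustrative sketch only: a candidate split whose score does not beat
    the parent's is discarded, and the hessian_split_score_subtract_parent
    behavior is modeled by ranking splits on their improvement over the
    parent rather than on the raw score.
    """
    if split_score < parent_score:
        return None  # Discard: the split does not improve on its parent.
    if subtract_parent:
        return split_score - parent_score
    return split_score
```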

0.2.3

27 Jan 19:24

Features

  • Honest Random Forests (also works with Gradient Boosted Trees and CART).
  • Can train Random Forests with example sampling without replacement.
  • Add support for Focal Loss in Gradient Boosted Tree learner.
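For reference, the binary focal loss (Lin et al., 2017) referred to above is FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t), where p_t is the predicted probability of the true class. A minimal sketch of the formula, not YDF's implementation:

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: down-weights well-classified (easy) examples.

    p is the predicted probability of the positive class, y is the true
    label in {0, 1}. With gamma=0 and alpha=0.5 this reduces to half the
    usual binary cross-entropy.
    """
    p_t = p if y == 1 else 1.0 - p              # probability of the true class
    alpha_t = alpha if y == 1 else 1.0 - alpha  # class-balancing weight
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

Larger gamma shrinks the loss of confident correct predictions faster, focusing training on hard examples.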

Fixes

  • Incorrect default evaluation of categorical splits with uplift tasks. This
    was making uplift models with missing categorical values perform worse, and
    possibly slowed the inference of uplift models.

0.2.2

15 Dec 17:12

Features

  • The CART learner exports the number of pruned nodes in the output model
    meta-data. Note: The CART learner outputs a Random Forest model with a
    single tree.
  • The Random Forest and CART learners support the CATEGORICAL_UPLIFT task.
  • Add SetLoggingLevel to control the amount of logging.

Fixes

  • Fix tree pruning in the CART learner for regression tasks.

0.2.0

01 Nov 16:44

Features

  • Distributed training of Gradient Boosted Decision Trees.
  • Add maximum_model_size_in_memory_in_bytes hyper-parameter to limit the
    size of the model in memory.

Fixes

  • Fix invalid splitting of pre-sorted numerical features (make sure to use
    the midpoint).
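The fix above concerns the numerical thresholds generated from pre-sorted feature values: the threshold between two consecutive distinct values should be their midpoint. A minimal sketch of that idea, with a hypothetical helper name, not YDF's splitter code:

```python
def midpoint_thresholds(values):
    """Candidate split thresholds for a numerical feature: midpoints between
    consecutive distinct sorted values, so a threshold never coincides with
    an observed value and behaves symmetrically on unseen data."""
    distinct = sorted(set(values))
    return [(a + b) / 2.0 for a, b in zip(distinct, distinct[1:])]
```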

0.1.3

19 May 09:42
e2c9a49

Features

  • Register new inference engines.

0.1.2

18 May 17:23

Features

  • Inference engines: QuickScorer Extended and Pred