This is an implementation of the paper: A Bayesian Decision Tree Algorithm by Nuti et al.
This package implements:
- Classification (binary and multiclass)
- Regression

Both are available in two versions:
- Perpendicular Trees: The classic decision/regression tree structure with splits along a single feature dimension (i.e., perpendicular to a feature axis), analogous to e.g. the scikit-learn decision and regression trees. These models are called `PerpendicularClassificationTree` and `PerpendicularRegressionTree`.
- Hyperplane Trees: Decision/regression trees using arbitrarily-oriented hyperplanes. These models are more flexible than perpendicular trees because they cover a much larger search space and naturally exploit correlations between features. All else being equal, hyperplane trees typically produce shallower trees with fewer leaf nodes than their perpendicular counterparts because each split can use more than a single feature dimension. This can mean less overfitting and better generalization, but no such guarantees exist because hyperplane trees are still constructed greedily. Note that hyperplane trees take much longer to train and need to be trained stochastically using global optimizers due to the exponentially larger search space. These models are called `HyperplaneClassificationTree` and `HyperplaneRegressionTree`.
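The relationship between the two split types can be illustrated with a minimal NumPy sketch (purely conceptual, not part of the package API): a perpendicular split thresholds one feature, while a hyperplane split thresholds a linear combination of all features, and a one-hot weight vector reduces the latter to the former.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))  # 6 points, 2 features

def perpendicular_split(X, dim, threshold):
    # Classic axis-aligned split: threshold a single feature dimension
    return X[:, dim] > threshold

def hyperplane_split(X, w, b):
    # Arbitrarily-oriented split: which side of the hyperplane
    # {x : w @ x + b = 0} does each point fall on?
    return X @ w + b > 0

# With a one-hot weight vector, the hyperplane split coincides with the
# perpendicular split on that dimension, so hyperplane splits strictly
# generalize perpendicular ones.
mask_perp = perpendicular_split(X, dim=0, threshold=0.5)
mask_hyp = hyperplane_split(X, w=np.array([1.0, 0.0]), b=-0.5)
assert np.array_equal(mask_perp, mask_hyp)
```

This also shows why the hyperplane search space is so much larger: instead of choosing one dimension and one threshold, the optimizer must choose a full weight vector `w` and offset `b`, which is why stochastic global optimization is required.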
To install you can use either conda or pip.

Using conda:
```bash
git clone https://github.com/UBS-IB/bayesian_tree
cd bayesian_tree
conda build conda.recipe
conda install --use-local bayesian_decision_tree
```

Using pip:
```bash
git clone https://github.com/UBS-IB/bayesian_tree
cd bayesian_tree
pip install -e .
```
We include some examples for various uses in the examples directory. The models are fully compatible with scikit-learn, so you can use them for e.g. cross-validation or performance evaluation using scikit-learn functions.
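Because the models follow the scikit-learn estimator interface, they can be dropped into standard scikit-learn workflows. The sketch below shows the cross-validation pattern; it uses scikit-learn's own `DecisionTreeClassifier` as a stand-in estimator so the snippet runs without this package installed, but any of the tree models above could take its place.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# A small synthetic binary classification problem
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# Stand-in estimator: any scikit-learn-compatible model works here,
# e.g. a PerpendicularClassificationTree from this package
model = DecisionTreeClassifier(random_state=0)

# Standard scikit-learn 5-fold cross-validation
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean())
```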
Feature ideas:
- Add a parallelization option (dask)