
Commit

fix rebase.
trivialfis committed Mar 10, 2023
1 parent 31cf9e8 commit 543002c
Showing 6 changed files with 14 additions and 17 deletions.
6 changes: 3 additions & 3 deletions demo/guide-python/multioutput_regression.py
@@ -95,7 +95,7 @@ def rmse(predt: np.ndarray, dtrain: xgb.DMatrix) -> Tuple[str, float]:
         {
             "tree_method": "hist",
             "num_target": y.shape[1],
-            "multi_strategy": "mono",
+            "multi_strategy": "monolithic",
         },
         dtrain=Xy,
         num_boost_round=128,
@@ -116,8 +116,8 @@ def rmse(predt: np.ndarray, dtrain: xgb.DMatrix) -> Tuple[str, float]:
     args = parser.parse_args()
     # Train with builtin RMSE objective
     # one model per output
-    rmse_model(args.plot == 1, "compo")
+    rmse_model(args.plot == 1, "composite")
     # one model for all outputs
-    rmse_model(args.plot == 1, "mono")
+    rmse_model(args.plot == 1, "monolithic")
     # Train with custom objective.
     custom_rmse_model(args.plot == 1)
6 changes: 3 additions & 3 deletions doc/parameter.rst
@@ -226,12 +226,12 @@ Parameters for Tree Booster
   list is a group of indices of features that are allowed to interact with each other.
   See :doc:`/tutorials/feature_interaction_constraint` for more information.
 
-* ``multi_strategy``, [default = ``compo``]
+* ``multi_strategy``, [default = ``composite``]
 
   - The strategy used for training multi-target models.
 
-  - ``compo``: One model for each target.
-  - ``mono``: Use multi-target trees.
+  - ``composite``: One model for each target.
+  - ``monolithic``: Use multi-target trees.
 
 .. _cat-param:
6 changes: 3 additions & 3 deletions doc/tutorials/multioutput.rst
@@ -54,12 +54,12 @@ Training with Vector Leaf
 
 XGBoost can optionally build multi-output trees with the size of leaf equals to the number
 of targets. The behavior can be controlled by the ``multi_strategy`` training
-parameter. It can take the value `compo` (the default) or `mono`. Specify `mono` and use
-``tree_method=hist`` to enable this feature.
+parameter. It can take the value `composite` (the default) or `monolithic`. Specify
+`monolithic` and use ``tree_method=hist`` to enable this feature.
 
 
 .. code-block:: python
 
-    clf = xgb.XGBClassifier(tree_method="hist", multi_strategy="mono")
+    clf = xgb.XGBClassifier(tree_method="hist", multi_strategy="monolithic")
 
 See :ref:`sphx_glr_python_examples_multioutput_regression.py` for a worked example.
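The distinction the tutorial text draws between the two strategies can be pictured outside of XGBoost entirely. The sketch below is a hypothetical pure-NumPy illustration (none of these helpers exist in XGBoost): "composite" fits one independent model per target column, while "monolithic" fits a single joint model that produces all targets at once. Plain least squares stands in for the model here, where the two approaches happen to coincide; the point is only the shape of the two workflows.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
W_true = rng.normal(size=(3, 2))  # two targets
Y = X @ W_true

# "composite": one independent fit per target column
W_composite = np.column_stack(
    [np.linalg.lstsq(X, Y[:, k], rcond=None)[0] for k in range(Y.shape[1])]
)

# "monolithic": a single fit covering all target columns at once
W_monolithic = np.linalg.lstsq(X, Y, rcond=None)[0]

# For linear least squares the per-target and joint fits agree exactly
print(np.allclose(W_composite, W_monolithic))
```

With a tree booster the two strategies generally do not produce the same model, since a shared tree structure couples the targets; that is why the choice is exposed as a training parameter.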
2 changes: 2 additions & 0 deletions python-package/xgboost/sklearn.py
Expand Up @@ -624,6 +624,7 @@ def __init__(
feature_types: Optional[FeatureTypes] = None,
max_cat_to_onehot: Optional[int] = None,
max_cat_threshold: Optional[int] = None,
multi_strategy: Optional[str] = None,
eval_metric: Optional[Union[str, List[str], Callable]] = None,
early_stopping_rounds: Optional[int] = None,
callbacks: Optional[List[TrainingCallback]] = None,
Expand Down Expand Up @@ -670,6 +671,7 @@ def __init__(
self.feature_types = feature_types
self.max_cat_to_onehot = max_cat_to_onehot
self.max_cat_threshold = max_cat_threshold
self.multi_strategy = multi_strategy
self.eval_metric = eval_metric
self.early_stopping_rounds = early_stopping_rounds
self.callbacks = callbacks
Expand Down
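The sklearn.py change follows the standard scikit-learn estimator convention: a new keyword is accepted in `__init__` and stored verbatim on `self`, so that parameter introspection (`get_params`/`set_params`) can round-trip it. A minimal sketch of that pattern, assuming a hypothetical estimator class (this is not XGBoost's actual `XGBModel`):

```python
class SketchEstimator:
    """Toy estimator showing the store-init-args-verbatim convention."""

    def __init__(self, max_cat_threshold=None, multi_strategy=None):
        # scikit-learn expects __init__ to do nothing but store its
        # arguments unchanged; validation happens later, at fit() time.
        self.max_cat_threshold = max_cat_threshold
        self.multi_strategy = multi_strategy

    def get_params(self):
        # Real scikit-learn introspects the __init__ signature; this toy
        # version just returns the stored attributes by name.
        return {
            "max_cat_threshold": self.max_cat_threshold,
            "multi_strategy": self.multi_strategy,
        }


est = SketchEstimator(multi_strategy="monolithic")
print(est.get_params()["multi_strategy"])  # prints "monolithic"
```

Storing the argument unchanged is what makes the new parameter visible to grid search and cloning utilities without any further wiring.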
5 changes: 0 additions & 5 deletions src/learner.cc
@@ -67,11 +67,6 @@ const char* kMaxDeltaStepDefaultValue = "0.7";
 
 DECLARE_FIELD_ENUM_CLASS(xgboost::MultiStrategy);
 
-namespace xgboost {
-std::string StrategyStr(Strategy s) { return s == Strategy::kComposite ? "compo" : "mono"; }
-}  // namespace xgboost
-DECLARE_FIELD_ENUM_CLASS(xgboost::Strategy);
-
 namespace xgboost {
 Learner::~Learner() = default;
 namespace {
6 changes: 3 additions & 3 deletions tests/cpp/predictor/test_gpu_predictor.cu
@@ -36,7 +36,7 @@ TEST(GPUPredictor, Basic) {
 
   Context ctx;
   ctx.gpu_id = 0;
-  LearnerModelParam mparam{MakeMP(n_col, .5, 1, ctx.gpu_id)};
+  LearnerModelParam mparam{MakeMP(n_col, .5, ctx.gpu_id)};
   gbm::GBTreeModel model = CreateTestModel(&mparam, &ctx);
 
   // Test predict batch
@@ -150,7 +150,7 @@ TEST(GPUPredictor, ShapStump) {
 
   Context ctx;
   ctx.gpu_id = 0;
-  LearnerModelParam mparam{MakeMP(1, .5, 1, 1, ctx.gpu_id)};
+  LearnerModelParam mparam{MakeMP(1, .5, 1, ctx.gpu_id)};
   gbm::GBTreeModel model(&mparam, &ctx);
 
   std::vector<std::unique_ptr<RegTree>> trees;
@@ -177,7 +177,7 @@ TEST(GPUPredictor, ShapStump) {
 TEST(GPUPredictor, Shap) {
   Context ctx;
   ctx.gpu_id = 0;
-  LearnerModelParam mparam{MakeMP(1, .5, 1, 1, ctx.gpu_id)};
+  LearnerModelParam mparam{MakeMP(1, .5, 1, ctx.gpu_id)};
   gbm::GBTreeModel model(&mparam, &ctx);
 
   std::vector<std::unique_ptr<RegTree>> trees;
