[doc] better doc for keep_training_booster (#3275)
* [doc] better doc for `keep_training_booster`

* Update python-package/lightgbm/engine.py

Co-authored-by: Nikita Titov <nekit94-08@mail.ru>

Co-authored-by: Nikita Titov <nekit94-08@mail.ru>
guolinke and StrikerRUS authored Aug 6, 2020
1 parent 9b26373 commit 6f54ec3
Showing 1 changed file with 1 addition and 0 deletions.
1 change: 1 addition & 0 deletions python-package/lightgbm/engine.py
@@ -128,6 +128,7 @@ def train(params, train_set, num_boost_round=100,
keep_training_booster : bool, optional (default=False)
Whether the returned Booster will be used to keep training.
If False, the returned value will be converted into _InnerPredictor before returning.
If your model is very large and causes memory errors, you can try to set this parameter to ``True`` to avoid the model conversion performed during the internal call of ``model_to_string``.
You can still use _InnerPredictor as ``init_model`` for future continued training.
callbacks : list of callables or None, optional (default=None)
List of callback functions that are applied at each iteration.
