first metric only in earlystopping for cli #2172

Merged (4 commits) on May 16, 2019
Changes from 3 commits
6 changes: 6 additions & 0 deletions docs/Parameters.rst
@@ -240,6 +240,12 @@ Learning Control Parameters

- ``<= 0`` means disable

- ``first_metric_only`` :raw-html:`<a id="first_metric_only" title="Permalink to this parameter" href="#first_metric_only">&#x1F517;&#xFE0E;</a>`, default = ``false``, type = bool

- set this to ``true``, if you want to use only the first metric for early stopping

- **Note**: can be used only in CLI version

- ``max_delta_step`` :raw-html:`<a id="max_delta_step" title="Permalink to this parameter" href="#max_delta_step">&#x1F517;&#xFE0E;</a>`, default = ``0.0``, type = double, aliases: ``max_tree_output``, ``max_leaf_output``

- used to limit the max output of tree leaves
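The ``first_metric_only`` entry documented above is CLI-only; a minimal CLI config sketch using it might look like the following (file names, paths, and metric choices are illustrative, not from this PR):

```ini
# hypothetical train.conf; data paths and metrics are illustrative
task = train
data = train.txt
valid = valid.txt
objective = binary
metric = binary_logloss,auc
early_stopping_round = 10
# with several metrics listed, stop on the first one (binary_logloss) only
first_metric_only = true
```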
4 changes: 4 additions & 0 deletions include/LightGBM/config.h
@@ -260,6 +260,10 @@ struct Config {
// desc = ``<= 0`` means disable
int early_stopping_round = 0;

// desc = set this to ``true``, if you want to use only the first metric for early stopping
// desc = **Note**: can be used only in CLI version
Collaborator (author):
Although this parameter is actually used only in the CLI version, there is another parameter with the same name in the Python/R packages. Therefore, I think the note is not needed, just as there is none for early_stopping_round.

Collaborator:

It's a slightly different situation (in Python, at least): early_stopping_round is a parameter of train/cv, so everything is OK, but first_metric_only is a parameter of the callback.

def early_stopping(stopping_rounds, first_metric_only=False, verbose=True):

Collaborator (author):

OK, I think first_metric_only should be in train/cv as well, like early_stopping_round:

if early_stopping_rounds is not None:
    callbacks.add(callback.early_stopping(early_stopping_rounds, verbose=False))

Collaborator:

Do you think that we should try to retrieve first_metric_only from params and pass to callback?

if early_stopping_rounds is not None:
    callbacks.add(callback.early_stopping(early_stopping_rounds, verbose=bool(verbose_eval)))

Collaborator (author):

yes, just like the early stopping rounds:

for alias in ["early_stopping_round", "early_stopping_rounds", "early_stopping"]:
    if alias in params and params[alias] is not None:
        early_stopping_rounds = int(params.pop(alias))
        warnings.warn("Found `{}` in params. Will use it instead of argument".format(alias))
        break

Collaborator:
Got it! Then I'll create a PR for this and remove my note.
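The alias-retrieval pattern quoted above could be mirrored for the flag under discussion; a sketch of that idea follows (the helper name, warning text, and params layout are assumptions, not LightGBM code):

```python
import warnings

def pop_first_metric_only(params, default=False):
    # Hypothetical helper mirroring the alias loop quoted above.
    if "first_metric_only" in params and params["first_metric_only"] is not None:
        warnings.warn("Found `first_metric_only` in params. "
                      "Will use it instead of argument")
        return bool(params.pop("first_metric_only"))
    return default

params = {"metric": "l2", "early_stopping_round": 5, "first_metric_only": True}
first_metric_only = pop_first_metric_only(params)
# the result would then be forwarded to the callback, e.g.
# callback.early_stopping(early_stopping_rounds, first_metric_only=first_metric_only)
```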

bool first_metric_only = false;

// alias = max_tree_output, max_leaf_output
// desc = used to limit the max output of tree leaves
// desc = ``<= 0`` means no constraint
21 changes: 11 additions & 10 deletions src/boosting/gbdt.cpp
Expand Up @@ -22,6 +22,7 @@ GBDT::GBDT() : iter_(0),
train_data_(nullptr),
objective_function_(nullptr),
early_stopping_round_(0),
es_first_metric_only_(false),
max_feature_idx_(0),
num_tree_per_iteration_(1),
num_class_(1),
@@ -51,6 +52,7 @@ void GBDT::Init(const Config* config, const Dataset* train_data, const Objective
num_class_ = config->num_class;
config_ = std::unique_ptr<Config>(new Config(*config));
early_stopping_round_ = config_->early_stopping_round;
es_first_metric_only_ = config_->first_metric_only;
shrinkage_rate_ = config_->learning_rate;

std::string forced_splits_path = config->forcedsplits_filename;
@@ -129,20 +131,18 @@ void GBDT::AddValidDataset(const Dataset* valid_data,
   }
   valid_score_updater_.push_back(std::move(new_score_updater));
   valid_metrics_.emplace_back();
-  if (early_stopping_round_ > 0) {
-    best_iter_.emplace_back();
-    best_score_.emplace_back();
-    best_msg_.emplace_back();
-  }
   for (const auto& metric : valid_metrics) {
     valid_metrics_.back().push_back(metric);
-    if (early_stopping_round_ > 0) {
-      best_iter_.back().push_back(0);
-      best_score_.back().push_back(kMinScore);
-      best_msg_.back().emplace_back();
-    }
   }
   valid_metrics_.back().shrink_to_fit();
+
+  if (early_stopping_round_ > 0) {
+    auto num_metrics = valid_metrics.size();
+    if (es_first_metric_only_) { num_metrics = 1; }
+    best_iter_.emplace_back(num_metrics, 0);
+    best_score_.emplace_back(num_metrics, kMinScore);
+    best_msg_.emplace_back(num_metrics);
+  }
 }

void GBDT::Boosting() {
@@ -514,6 +514,7 @@ std::string GBDT::OutputMetric(int iter) {
        msg_buf << tmp_buf.str() << '\n';
      }
    }
    if (es_first_metric_only_ && j > 0) { continue; }
    if (ret.empty() && early_stopping_round_ > 0) {
      auto cur_score = valid_metrics_[i][j]->factor_to_bigger_better() * test_scores.back();
      if (cur_score > best_score_[i][j]) {
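The per-metric bookkeeping in this gbdt.cpp hunk can be sketched in pure Python; the `first_metric_only` skip is the behavior the diff adds, while the function shape and names here are illustrative, not LightGBM's API:

```python
K_MIN_SCORE = float("-inf")  # counterpart of kMinScore in the C++ code

def check_metrics(scores, best_score, best_iter, iteration,
                  early_stopping_round, first_metric_only):
    """Update per-metric bests (scores already converted so bigger is better);
    return the best iteration of the metric that triggered stopping, else None."""
    for j, score in enumerate(scores):
        if first_metric_only and j > 0:
            continue  # the new flag: later metrics no longer drive stopping
        if score > best_score[j]:
            best_score[j] = score
            best_iter[j] = iteration
        elif iteration - best_iter[j] >= early_stopping_round:
            return best_iter[j]
    return None
```

With the flag set, a second metric that keeps improving cannot keep training alive once the first metric has stalled for `early_stopping_round` iterations.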
2 changes: 2 additions & 0 deletions src/boosting/gbdt.h
@@ -434,6 +434,8 @@ class GBDT : public GBDTBase {
std::vector<std::vector<const Metric*>> valid_metrics_;
/*! \brief Number of rounds for early stopping */
int early_stopping_round_;
/*! \brief Only use first metric for early stopping */
bool es_first_metric_only_;
/*! \brief Best iteration(s) for early stopping */
std::vector<std::vector<int>> best_iter_;
/*! \brief Best score(s) for early stopping */
4 changes: 4 additions & 0 deletions src/io/config_auto.cpp
@@ -181,6 +181,7 @@ std::unordered_set<std::string> Config::parameter_set({
"feature_fraction",
"feature_fraction_seed",
"early_stopping_round",
"first_metric_only",
"max_delta_step",
"lambda_l1",
"lambda_l2",
@@ -312,6 +313,8 @@ void Config::GetMembersFromString(const std::unordered_map<std::string, std::string>& params) {

GetInt(params, "early_stopping_round", &early_stopping_round);

GetBool(params, "first_metric_only", &first_metric_only);

GetDouble(params, "max_delta_step", &max_delta_step);

GetDouble(params, "lambda_l1", &lambda_l1);
@@ -556,6 +559,7 @@ std::string Config::SaveMembersToString() const {
str_buf << "[feature_fraction: " << feature_fraction << "]\n";
str_buf << "[feature_fraction_seed: " << feature_fraction_seed << "]\n";
str_buf << "[early_stopping_round: " << early_stopping_round << "]\n";
str_buf << "[first_metric_only: " << first_metric_only << "]\n";
str_buf << "[max_delta_step: " << max_delta_step << "]\n";
str_buf << "[lambda_l1: " << lambda_l1 << "]\n";
str_buf << "[lambda_l2: " << lambda_l2 << "]\n";
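GetBool above parses the textual parameter into the bool member that SaveMembersToString later writes back out as `[first_metric_only: ...]`; a sketch of that kind of key=value bool parsing (the accepted spellings are an assumption, not taken from config_auto.cpp):

```python
def get_bool(params, name, default=False):
    # Illustrative parsing of key=value config text; LightGBM's real
    # GetBool lives in the C++ config code and may accept other spellings.
    value = str(params.get(name, default)).strip().lower()
    if value in ("true", "1"):
        return True
    if value in ("false", "0"):
        return False
    return default

flag = get_bool({"first_metric_only": "true"}, "first_metric_only")
```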