Average precision score #3347
Conversation
could you please add this metric to the list we keep on the R side? https://github.com/microsoft/LightGBM/blob/8fc80bb487c6db76464a28888725d344819b56f9/R-package/R/metrics.R
looks good to me from the R side! I think another reviewer should approve before this is merged though.
@btrotta in case you aren't aware of the practice... I just added a label to this PR; the release notes are generated automatically based on these tags by https://github.com/microsoft/LightGBM/blob/master/.github/release-drafter.yml
@jameslamb Thanks! I didn't realize the tags were used for the release notes.
no problem!
src/metric/binary_metric.hpp (outdated)

  }

  void Init(const Metadata& metadata, data_size_t num_data) override {
    name_.emplace_back("ap");
@btrotta Was the name left as "ap" (instead of "average_precision", #3347 (comment)) intentionally?
Oops, this was a mistake, fixed now.
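For illustration, a tiny hypothetical sketch of what the fix amounts to; the struct and Init() signature below are simplified stand-ins, not LightGBM's real classes. The metric pushes its reported name into name_, so using the full string makes evaluation output show "average_precision" instead of "ap":

```cpp
#include <iostream>
#include <string>
#include <vector>

// Simplified stand-in for the metric class; only the name registration is shown.
struct AveragePrecisionMetricSketch {
  std::vector<std::string> name_;
  void Init() {
    name_.emplace_back("average_precision");  // previously "ap"
  }
};

int main() {
  AveragePrecisionMetricSketch metric;
  metric.Init();
  std::cout << metric.name_[0] << "\n";  // prints: average_precision
  return 0;
}
```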
This pull request has been automatically locked because there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this.
Implement average precision score for binary classification. This is very similar to the precision-recall AUC requested in #3026, but it does not interpolate the points of the curve, since this is not recommended for PR curves (see https://scikit-learn.org/stable/modules/generated/sklearn.metrics.average_precision_score.html).
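For readers unfamiliar with the formula, here is a minimal standalone sketch (not the PR's actual implementation) of non-interpolated average precision, AP = sum_k (R_k - R_{k-1}) * P_k, where P_k and R_k are precision and recall after the k-th highest-scored sample. It ignores sample weights and tied scores, which the real metric has to handle:

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

// Non-interpolated average precision: AP = sum_k (R_k - R_{k-1}) * P_k.
double AveragePrecision(const std::vector<int>& label, const std::vector<double>& score) {
  // Visit samples from highest to lowest predicted score.
  std::vector<std::size_t> order(label.size());
  for (std::size_t i = 0; i < order.size(); ++i) order[i] = i;
  std::sort(order.begin(), order.end(),
            [&score](std::size_t a, std::size_t b) { return score[a] > score[b]; });

  const double num_pos = static_cast<double>(std::count(label.begin(), label.end(), 1));
  if (num_pos == 0.0) return 0.0;

  double tp = 0.0;          // true positives seen so far
  double prev_recall = 0.0;
  double ap = 0.0;
  for (std::size_t k = 0; k < order.size(); ++k) {
    if (label[order[k]] == 1) tp += 1.0;
    const double precision = tp / static_cast<double>(k + 1);
    const double recall = tp / num_pos;
    // Weight precision by the recall increment; no interpolation between points,
    // so only samples that add a true positive contribute a non-zero term.
    ap += (recall - prev_recall) * precision;
    prev_recall = recall;
  }
  return ap;
}

int main() {
  std::vector<int> y{0, 0, 1, 1};
  std::vector<double> s{0.1, 0.4, 0.35, 0.8};
  std::cout << AveragePrecision(y, s) << "\n";  // prints 0.833333
  return 0;
}
```

For this small example (no ties, no weights) the result agrees with sklearn.metrics.average_precision_score on the same inputs.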