
[Metrics] R2Score #5241

Merged
merged 15 commits into from
Jan 1, 2021

Conversation

SkafteNicki
Member

What does this PR do?

Adds an R2Score metric similar to sklearn's r2_score (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.r2_score.html).
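For context, the score this metric mirrors is R² = 1 − SS_res / SS_tot. A minimal pure-Python sketch of that formula (illustrative only, not the PR's torch implementation; the example values come from the sklearn docs):

```python
def r2_score(preds, target):
    """Plain-Python R^2 = 1 - SS_res / SS_tot (illustrative sketch)."""
    mean_t = sum(target) / len(target)
    ss_res = sum((t - p) ** 2 for t, p in zip(target, preds))
    ss_tot = sum((t - mean_t) ** 2 for t in target)
    return 1 - ss_res / ss_tot

print(r2_score([2.5, 0.0, 2.0, 8.0], [3.0, -0.5, 2.0, 7.0]))  # ~0.9486
```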

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified; bug fixes should be included in bug-fix release milestones (m.f.X) and features in minor (m.X.b) releases.

Did you have fun?

Make sure you had fun coding 🙃

@pep8speaks

pep8speaks commented Dec 23, 2020

Hello @SkafteNicki! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-12-31 14:28:20 UTC

@rohitgr7
Contributor

NICE!!
How about adding adjusted r2 too? Or maybe extending it with the additional arguments required for adjusted r2? Won't be so trivial, I guess.

Contributor

@teddykoker teddykoker left a comment


LGTM :)

tests/metrics/regression/test_r2score.py — 3 outdated review threads (resolved)
Co-authored-by: Teddy Koker <teddy.koker@gmail.com>
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
@codecov

codecov bot commented Dec 23, 2020

Codecov Report

Merging #5241 (0218f5d) into release/1.2-dev (73e06fd) will increase coverage by 0%.
The diff coverage is 94%.

@@               Coverage Diff               @@
##           release/1.2-dev   #5241   +/-   ##
===============================================
  Coverage               93%     93%           
===============================================
  Files                  144     146    +2     
  Lines                10146   10211   +65     
===============================================
+ Hits                  9425    9486   +61     
- Misses                 721     725    +4     

Contributor

@gianscarpe gianscarpe left a comment


Just a suggestion, let me know :)

@SkafteNicki
Member Author

@rohitgr7 good point about the adjusted r2 score.
I have added an `adjusted` arg corresponding to the number of independent regressors, which the user needs to provide. If provided, the metric returns the adjusted score.
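For reference, the adjusted score penalizes R² by the number of regressors k over n samples: R²_adj = 1 − (1 − R²)(n − 1)/(n − k − 1). A hand-rolled sketch of that formula (illustrative only, not the PR's implementation):

```python
def adjusted_r2(r2, n_samples, n_regressors):
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)."""
    return 1 - (1 - r2) * (n_samples - 1) / (n_samples - n_regressors - 1)

# With r2=0.9486 over n=4 samples and k=1 regressor, the score shrinks:
print(adjusted_r2(0.9486, 4, 1))  # 1 - 0.0514 * 3/2 = 0.9229
```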

Contributor

@gianscarpe gianscarpe left a comment


lgtm! :)

pytorch_lightning/metrics/functional/r2score.py — 2 outdated review threads (resolved)
adjusted: number of independent regressors for calculating adjusted r2 score.
Default 0 (standard r2 score).
multioutput: Defines aggregation in the case of multiple output scores. Can be one
of the following strings (default is `'uniform_average'`.):
Contributor


Suggested change
of the following strings (default is `'uniform_average'`.):
of the following strings (default is ``'uniform_average'``.):

Contributor


should defaults be removed from the docs??

pytorch_lightning/metrics/functional/r2score.py — 2 outdated review threads (resolved)
pytorch_lightning/metrics/regression/r2score.py — 1 outdated review thread (resolved)
@SkafteNicki SkafteNicki merged commit 9dbdffc into Lightning-AI:release/1.2-dev Jan 1, 2021
@bibinwils

https://forums.pytorchlightning.ai/t/to-find-r2score-of-my-model/764 — could anyone please help with this?
I have a UNet model. I'm training it as a regression model, since my output has a different floating-point value for each pixel. To check the r2score, I tried putting the code below in the model class's training_step, validation_step, and test_step.

r2 = r2score(logits, y)
self.log('r2:',r2)

But it’s giving the following error

ValueError: Expected both prediction and target to be 1D or 2D tensors, but recevied tensors with dimension torch.Size([50, 1, 32, 32])
How can I check my model fit?
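The error comes from the input shape: the r2score functional only accepts 1D or 2D tensors, so the [50, 1, 32, 32] batch needs flattening first. A minimal sketch with numpy random data standing in for the real tensors (in torch, the equivalent reshape would be `logits.view(logits.size(0), -1)`):

```python
import numpy as np

# Hypothetical stand-ins for the shapes in the question:
# a batch of 50 single-channel 32x32 output maps.
logits = np.random.rand(50, 1, 32, 32)
y = np.random.rand(50, 1, 32, 32)

# Flatten channel and spatial dims so each row holds one sample's pixels:
logits_2d = logits.reshape(logits.shape[0], -1)  # shape (50, 1024)
y_2d = y.reshape(y.shape[0], -1)                 # shape (50, 1024)

print(logits_2d.shape, y_2d.shape)  # (50, 1024) (50, 1024)
```

The flattened 2D tensors can then be passed to r2score; with multiple columns, the `multioutput` argument controls how the per-column scores are aggregated.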

Labels
feature Is an improvement or enhancement
Projects
None yet
Development

Successfully merging this pull request may close these issues.

8 participants