[Metrics] class based embedding similarity + tests #3358
Conversation
Codecov Report
@@ Coverage Diff @@
## master #3358 +/- ##
========================================
- Coverage 90% 78% -13%
========================================
Files 93 111 +18
Lines 8188 10286 +2098
========================================
+ Hits 7397 7990 +593
- Misses 791 2296 +1505
LGTM, just some whitespace that needs to be fixed for proper display in HTML
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Hello @SkafteNicki! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2020-09-11 06:10:39 UTC
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
pytorch_lightning/metrics/metric.py
Outdated
@@ -287,3 +287,6 @@ def output_convert(self, data: Any, output: Any):
     def ddp_sync(self, data: Any, output: Any):
         return apply_to_collection(output, torch.Tensor, sync_ddp_if_available,
                                    self.reduce_group, self.reduce_op)
+
+
+__all__ = ["Metric", "TensorMetric", "NumpyMetric"]
this is not a very common place for __all__
codefactor also complains about this, not sure why. I think it should be fine though.
the point is that `__all__` is very much about importing from packages when you do not want `import *` to pull in all functions
https://stackoverflow.com/questions/44834/can-someone-explain-all-in-python
I agree that it is not a common place for `__all__`; normally I would put it at the top of the file, but then codefactor complains about it. But I can move it to the top if that is better.
yes, it is not very common to have it in files other than `__init__.py`
so was there a reason to move it from the init?
It was changed due to a comment from Justus at some point, but let's change it back since it is a very uncommon practice
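For context on the discussion above: `__all__` only affects what `from module import *` exposes; its placement in the file does not change that behavior. A minimal self-contained sketch (the module name `mymetrics` and its classes are hypothetical, not the actual Lightning module):

```python
import sys
import types

# Build a throwaway in-memory module to demonstrate __all__
# (names are hypothetical, chosen to mirror the PR's metric classes).
mod = types.ModuleType("mymetrics")
exec(
    "__all__ = ['Metric']\n"
    "class Metric: ...\n"
    "class TensorMetric: ...\n"   # public, but excluded from __all__
    "class _Hidden: ...\n",       # private by convention
    mod.__dict__,
)
sys.modules["mymetrics"] = mod

# `import *` honors __all__ and only brings in 'Metric'
ns = {}
exec("from mymetrics import *", ns)
print(sorted(k for k in ns if not k.startswith("__")))  # ['Metric']
```

Regular imports like `from mymetrics import TensorMetric` still work regardless, which is why the main effect of `__all__` is on wildcard imports and on documentation tools that read it.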
What does this PR do?
PR #3349 added a new functional metric `embedding_similarity`. This PR follows up by adding a class-based interface and tests against sklearn.
Before submitting
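As a rough illustration of the pattern this PR describes, a class-based interface is typically a thin stateful wrapper around the functional metric. The sketch below is an approximation under assumed signatures, not the actual pytorch_lightning implementation:

```python
import torch
import torch.nn.functional as F


def embedding_similarity(batch: torch.Tensor,
                         similarity: str = "cosine",
                         zero_diagonal: bool = True) -> torch.Tensor:
    """Pairwise similarity matrix for an (N, D) batch of embeddings.

    Approximates the functional metric from PR #3349; the real
    signature may differ (e.g. it also takes a `reduction` argument).
    """
    if similarity == "cosine":
        normed = F.normalize(batch, p=2, dim=1)  # unit-norm rows
        sim = normed @ normed.T
    else:  # plain dot product
        sim = batch @ batch.T
    if zero_diagonal:
        sim.fill_diagonal_(0.0)  # self-similarity is usually uninteresting
    return sim


class EmbeddingSimilarity:
    """Class-based interface: stores the config, delegates to the functional metric."""

    def __init__(self, similarity: str = "cosine", zero_diagonal: bool = True):
        self.similarity = similarity
        self.zero_diagonal = zero_diagonal

    def __call__(self, batch: torch.Tensor) -> torch.Tensor:
        return embedding_similarity(batch, self.similarity, self.zero_diagonal)
```

Tests against sklearn would then compare this output with `sklearn.metrics.pairwise.cosine_similarity` on the same (NumPy) batch, modulo the zeroed diagonal.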
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃