Features/mean var merge cleanup #445
Conversation
Codecov Report
@@           Coverage Diff            @@
##           master     #445    +/-   ##
=========================================
  Coverage        ?    96.68%
=========================================
  Files           ?        57
  Lines           ?     12134
  Branches        ?         0
=========================================
  Hits            ?     11732
  Misses          ?       402
  Partials        ?         0
=========================================
Continue to review full report at Codecov.
Some tiny remarks on tinier things I noted
var_tot[0, 0] = merged[0]
var_tot[0, 1] = merged[1]
var_tot[0, 2] = merged[2]

return var_tot[0][0]

else:  # axis is given
Small inconsistency: when axis is set, integer tensors are no longer supported. NumPy supports both ints and floats, while PyTorch generally only supports floats. The same behaviour is present in 'mean'.
Because torch does not support ints, we don't either. If it's needed, the easiest way to handle it would be to cast the input to a float. I don't think we need to do this, though; I don't think we need to follow NumPy so religiously.
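As a sketch of the cast-to-float workaround mentioned above, using NumPy as a stand-in for torch (the helper name is hypothetical, not part of the PR):

```python
import numpy as np

def mean_with_float_promotion(x, axis=None):
    # Hypothetical helper: promote integer input to float before the
    # reduction, mirroring the cast-to-float workaround discussed above.
    x = np.asarray(x)
    if np.issubdtype(x.dtype, np.integer):
        x = x.astype(np.float64)
    return x.mean(axis=axis)
```

With this, `mean_with_float_promotion(np.array([[1, 2], [3, 4]]), axis=0)` yields `array([2., 3.])` even though the input dtype is integer.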
Would you be so kind as to remove the failing test? As of today, torch.logical_xor allows non-bool tensors in the new PyTorch version.
Yes, this is an issue that needs to be addressed. Are you working on it? Bjoern wants to put a release out today. I am in a meeting at the moment, but once it is done I will get to work on a fix for the latest version of PyTorch.
Should the fix be to use the bitwise xor?
I will make a pull request.
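For boolean inputs, bitwise XOR coincides element-wise with logical XOR, which is why a bitwise fix is plausible. A small illustration, using NumPy arrays as a stand-in for torch tensors:

```python
import numpy as np

a = np.array([True, True, False, False])
b = np.array([True, False, True, False])

# On bool arrays, the bitwise ^ operator produces exactly the
# element-wise logical XOR, so it can serve as a drop-in replacement.
assert np.array_equal(a ^ b, np.logical_xor(a, b))
```

Note this equivalence is guaranteed only for bool dtypes; on integer inputs, bitwise XOR operates on the bit patterns rather than on truth values.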
I think that Bessel's correction should be the default, though this is mostly personal preference. For general statistical methods (as well as more advanced ones) it will provide more statistically reliable results.
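To illustrate the difference between the two defaults, here is a minimal NumPy sketch (`ddof=1` applies Bessel's correction, dividing by n - 1 instead of n):

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0])  # mean 5, squared deviations sum to 20

biased = x.var(ddof=0)   # divide by n:     20 / 4 = 5.0
bessel = x.var(ddof=1)   # divide by n - 1: 20 / 3 ~= 6.667
```

The corrected estimator is unbiased for the population variance when the data are a sample, which is the usual argument for making it the default.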
Description
This is a cleanup of the mean and var functions. The moment merging function is rewritten to be used by both mean and var (it is also possible to extend this function to higher-order moments).
The var function also supports multi-dimensional axis arguments (the algorithm is the same as for mean).
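The pairwise moment merge described above can be sketched as follows (illustrative names, population variance, i.e. ddof=0; this is not the actual code in this PR):

```python
def merge_moments(mu_a, var_a, n_a, mu_b, var_b, n_b):
    """Merge the mean/variance of two partitions into the mean/variance
    of their union, following the classic pairwise update formulas."""
    n = n_a + n_b
    delta = mu_b - mu_a
    # Combined mean: shift mu_a by the weighted difference of the means.
    mu = mu_a + delta * n_b / n
    # M2 = total sum of squared deviations from the combined mean.
    m2 = var_a * n_a + var_b * n_b + delta ** 2 * n_a * n_b / n
    return mu, m2 / n, n
```

For example, merging the moments of [1, 2, 3] (mean 2, var 2/3, n=3) with those of [4, 5, 6, 7] (mean 5.5, var 1.25, n=4) recovers the moments of [1, ..., 7]: mean 4, variance 4, n=7.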
Fixes: #156
Changes proposed:
Type of change
Select relevant options.
Are all split configurations tested and accounted for?
Does this change require a documentation update outside of the changes proposed?
Does this change modify the behaviour of other functions?
Are there code practices which require justification?