fix #351: Solution 1: enforce weights.shape = x.shape for tuple axis #953

Closed

57 commits
240bc05
fix #351: Solution 1: enforce weights.shape = x.shape for tuple axis
Mystic-Slice Apr 3, 2022
116b9cb
Merge branch 'main' into RestrictWeightShape
ClaudiaComito Apr 25, 2022
885f686
Merge branch 'release/1.2.x', set `main` version extension to "dev"
ClaudiaComito Apr 27, 2022
e30d739
Update link to llnl MPI tutorial and merge branch 'release/1.2.x'
ClaudiaComito May 26, 2022
08d0dcb
Replace bug report MD template with form in view of further automation
ClaudiaComito Jun 1, 2022
545a694
Fix bug report file name
ClaudiaComito Jun 1, 2022
295351c
Update bug_report.yml
ClaudiaComito Jun 1, 2022
91f8190
Update bug_report.yml
ClaudiaComito Jun 1, 2022
ef94a9b
Update bug_report.yml
ClaudiaComito Jun 1, 2022
4a1fc1a
Update bug_report.yml
ClaudiaComito Jun 1, 2022
cb94ad1
Auto generated release notes and changelog (#974)
JuanPedroGHM Jun 1, 2022
231831e
Tutorial note about local and global printing (#972)
JuanPedroGHM Jun 1, 2022
c1ef707
Updated the tutorial document. (#977)
Sai-Suraj-27 Jun 1, 2022
ee0ff4d
Set write permissions for workflow
ClaudiaComito Jul 2, 2022
a1f0b1b
Update schedule
ClaudiaComito Jul 2, 2022
96506fa
Update schedule
ClaudiaComito Jul 2, 2022
385374a
Update schedule
ClaudiaComito Jul 2, 2022
bff6b2c
Move pytorch version file out of workflows dir
ClaudiaComito Jul 4, 2022
4f59d69
Update paths
ClaudiaComito Jul 4, 2022
55fff46
[pre-commit.ci] pre-commit autoupdate
pre-commit-ci[bot] Jul 5, 2022
4fe322b
Push pytorch release update to release/1.2.x branch, not main
ClaudiaComito Jul 5, 2022
a52156b
Update schedule
ClaudiaComito Jul 5, 2022
229e172
Bypass `on push` trigger
ClaudiaComito Jul 5, 2022
6510ea4
Update schedule
ClaudiaComito Jul 5, 2022
2a947a1
Fix condition syntax
ClaudiaComito Jul 5, 2022
6ccec82
Fix syntax
ClaudiaComito Jul 5, 2022
3cc0080
On push trigger workaround
ClaudiaComito Jul 5, 2022
6c487d6
Update schedule
ClaudiaComito Jul 5, 2022
05a2acd
Update schedule
ClaudiaComito Jul 5, 2022
2223ce8
Merge branch 'main' into pre-commit-ci-update-config
ClaudiaComito Jul 6, 2022
d39cbfe
Enable non-negative sample size
neosunhan Jul 15, 2022
105b905
Read `min` value directly from torch return object
neosunhan Jul 15, 2022
54ab969
Enable non-negative number of samples for `logspace`
neosunhan Jul 15, 2022
2dfd591
Add test for `logspace`
neosunhan Jul 18, 2022
99aef73
Merge pull request #995 from helmholtz-analytics/features/994-linspac…
Markus-Goetz Jul 20, 2022
92e64c4
Merge branch 'main' into bug/996-iinfo-finfo-min
mtar Jul 21, 2022
891a983
Merge pull request #997 from helmholtz-analytics/bug/996-iinfo-finfo-min
mtar Jul 21, 2022
7ca100d
Merge branch 'main' into pre-commit-ci-update-config
mtar Jul 21, 2022
15240b0
Merge pull request #984 from helmholtz-analytics/pre-commit-ci-update…
mtar Jul 21, 2022
c85562b
Add MPI version field to bug report template
ClaudiaComito Aug 22, 2022
52d88c0
fix: set cuda rng state on gpu tests for test_random.py (#1014)
JuanPedroGHM Aug 23, 2022
e56e6ec
Merge release/1.2.x into main
ClaudiaComito Sep 13, 2022
dd4a396
[pre-commit.ci] pre-commit autoupdate (#1024)
pre-commit-ci[bot] Sep 13, 2022
5ff0314
rename file and activate force push
mtar Sep 23, 2022
bd16040
Update bug_report.yml
mtar Sep 27, 2022
ba1225a
Update bug_report.yml
mtar Sep 27, 2022
9e4882a
Update README.md
mtar Sep 27, 2022
439d542
Update codecov.yml
mtar Sep 30, 2022
0949a48
Update codecov.yml
mtar Sep 30, 2022
398ac0d
Add section `Google Summer of Code 2022`
ClaudiaComito Oct 5, 2022
2fe13c3
Bug/1017 `prod` / `sum` with empty arrays (#1018)
neosunhan Oct 5, 2022
db44c93
Add section "Array API"
ClaudiaComito Oct 7, 2022
d5c14b7
Mirror Repository and run GitHub CI at HZDR (#1032)
mtar Oct 7, 2022
ea965ff
Bug/999 Fix `keepdim` in `any`/`all` (#1000)
neosunhan Oct 7, 2022
5ba9b32
[pre-commit.ci] pre-commit autoupdate
pre-commit-ci[bot] Oct 11, 2022
ae9dd50
Merge pull request #1033 from helmholtz-analytics/pre-commit-ci-updat…
mtar Oct 11, 2022
2c5c7be
Merge branch 'helmholtz-analytics:main' into RestrictWeightShape
Mystic-Slice Oct 22, 2022
8 changes: 4 additions & 4 deletions heat/core/statistics.py

@@ -206,14 +206,12 @@ def average(
         Axis or axes along which to average ``x``. The default,
         ``axis=None``, will average over all of the elements of the input array.
         If axis is negative it counts from the last to the first axis.
-        #TODO Issue #351: If axis is a tuple of ints, averaging is performed on all of the axes
-        specified in the tuple instead of a single axis or all the axes as
-        before.
     weights : DNDarray, optional
         An array of weights associated with the values in ``x``. Each value in
         ``x`` contributes to the average according to its associated weight.
         The weights array can either be 1D (in which case its length must be
         the size of ``x`` along the given axis) or of the same shape as ``x``.
+        Weighted average over tuple axis requires weights array to be of the same shape as ``x``.
         If ``weights=None``, then all data in ``x`` are assumed to have a
         weight equal to one, the result is equivalent to :func:`mean`.
     returned : bool, optional
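The docstring's core contract ("each value in ``x`` contributes to the average according to its associated weight") is the standard weighted mean. A minimal illustration in plain NumPy (not Heat's API, just the equivalent semantics):

```python
import numpy as np

# Toy data: four values, each with an associated weight (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0])
w = np.array([4.0, 3.0, 2.0, 1.0])

# Weighted mean: each value contributes proportionally to its weight,
# i.e. sum(w * x) / sum(w)
manual = (x * w).sum() / w.sum()
print(manual)  # 2.0

# np.average with weights computes the same quantity
assert manual == np.average(x, weights=w)
```

With `weights=None` all weights default to one and the result reduces to the plain mean, which is exactly the equivalence to `mean` that the docstring states.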
@@ -269,7 +267,9 @@ def average(
         if axis is None:
             raise TypeError("Axis must be specified when shapes of x and weights differ.")
         elif isinstance(axis, tuple):
-            raise NotImplementedError("Weighted average over tuple axis not implemented yet.")
+            raise ValueError(
+                "Weighted average over tuple axis requires weights to be of the same shape as x."
+            )
         if weights.ndim != 1:
             raise TypeError("1D weights expected when shapes of x and weights differ.")
         if weights.gshape[0] != x.gshape[axis]:
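The validation chain this PR enforces can be sketched as a standalone helper. This is a hypothetical `check_weights` function over NumPy arrays, not Heat's actual code (Heat's `DNDarray` uses `gshape` for the global shape; the final error message is also an assumption, since the diff cuts off there):

```python
import numpy as np

def check_weights(x, weights, axis):
    """Hypothetical sketch of the shape checks enforced by this PR (not Heat's API)."""
    if weights.shape != x.shape:
        if axis is None:
            raise TypeError("Axis must be specified when shapes of x and weights differ.")
        elif isinstance(axis, tuple):
            # Solution 1: averaging over a tuple of axes demands weights.shape == x.shape
            raise ValueError(
                "Weighted average over tuple axis requires weights to be of the same shape as x."
            )
        if weights.ndim != 1:
            raise TypeError("1D weights expected when shapes of x and weights differ.")
        if weights.shape[0] != x.shape[axis]:
            # Assumed message; the original diff ends at this condition
            raise ValueError("Length of weights not compatible with specified axis.")

x = np.ones((3, 4))
try:
    # Mismatched 1D weights with a tuple axis now fail fast with ValueError
    check_weights(x, np.ones(3), axis=(0, 1))
except ValueError as e:
    print(e)
```

Same-shape weights pass for any axis, and 1D weights remain valid for a single integer axis, so only the previously `NotImplementedError` path changes behavior.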