[torch_xla2] Fix the test of cumsum (#7965)
guyao authored Sep 6, 2024
1 parent 901c3a3 commit 12e5958
Showing 2 changed files with 2 additions and 1 deletion.
1 change: 0 additions & 1 deletion experimental/torch_xla2/test/test_ops.py
@@ -32,7 +32,6 @@
"cross",
"cummax",
"cummin",
"cumsum",
"diag",
"diag_embed",
"diagflat",
2 changes: 2 additions & 0 deletions experimental/torch_xla2/torch_xla2/ops/jaten.py
@@ -556,6 +556,8 @@ def _aten_ne(x, y):
 def _aten_cumsum(x, y, dtype=None):
   if dtype:
     dtype = mappings.t2j_dtype(dtype)
+  if not x.shape:
+    return x
   res = jnp.cumsum(x, y, dtype)
   return res

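For context, here is a minimal, self-contained sketch of the fixed lowering, assuming plain JAX. The helper name _aten_cumsum_sketch and the direct dtype pass-through are illustrative; the real operator in jaten.py converts torch dtypes through mappings.t2j_dtype before calling jnp.cumsum.

# Hypothetical standalone sketch of the fixed cumsum lowering (plain JAX).
import jax.numpy as jnp

def _aten_cumsum_sketch(x, dim, dtype=None):
  # torch.cumsum keeps a 0-dim input 0-dim, so the guard returns scalar
  # inputs unchanged instead of forwarding them to jnp.cumsum; this
  # mirrors the two-line addition in the diff above.
  if not x.shape:
    return x
  return jnp.cumsum(x, axis=dim, dtype=dtype)

# Usage: a 0-dim input keeps its shape; a 1-D input is accumulated.
print(_aten_cumsum_sketch(jnp.asarray(3.0), 0).shape)   # ()
print(_aten_cumsum_sketch(jnp.asarray([1, 2, 3]), 0))   # [1 3 6]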
