Features/176 flatten #501
Conversation
Codecov Report

```
@@           Coverage Diff           @@
##           master     #501   +/-   ##
=======================================
  Coverage   96.39%   96.39%
=======================================
  Files          75       75
  Lines       14849    14887    +38
=======================================
+ Hits        14313    14351    +38
  Misses        536      536
```
Continue to review full report at Codecov.
This does flatten the array, but I think it might also destroy the global ordering. Can you look into this?

What do you mean?
```python
import torch
import heat as ht

x = torch.arange(18).reshape(3, 6)
x1 = ht.array(x, split=1)
print(x1)
print(ht.flatten(x1))
```
Also, there should be an inline function for flatten, so one can call it directly as a method on the tensor.
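A minimal sketch of what such a method wrapper could look like, assuming the module-level flatten added in this PR; the attribute name ht.DNDarray and the monkey-patching are purely illustrative, not the final API:

```python
import torch
import heat as ht

# Hypothetical convenience wrapper (illustrative only): expose flatten as a
# method so that x1.flatten() behaves like ht.flatten(x1). In the library this
# would be defined on the array class itself; the attribute name ht.DNDarray
# is an assumption here and may differ between heat versions.
def _flatten_method(self):
    return ht.flatten(self)

ht.DNDarray.flatten = _flatten_method

x1 = ht.array(torch.arange(18).reshape(3, 6), split=1)
print(x1.flatten())  # same result as ht.flatten(x1)
```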
It's a problem of the ht.resplit function then, isn't it?

```python
import torch
import heat as ht

x = torch.arange(18).reshape(3, 6)
x1 = ht.array(x, split=1)
x0 = ht.resplit(x1, 0)
print(ht.MPI_WORLD.rank, x0)
```

Output on two processes:

```
0 tensor([[ 0,  2,  7,  3,  5, 10],
          [ 1,  6,  8,  4,  9, 11]])
1 tensor([[12, 13, 14, 15, 16, 17]])
```
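To make the ordering problem concrete, here is a small torch-only sketch (no MPI required) of what an order-preserving resplit of the same example should yield, assuming the two-process row distribution visible in the output above:

```python
import torch

# For comparison: what an order-preserving resplit of the example above should
# produce on two processes. The 3 rows along axis 0 end up as two rows on
# rank 0 and one row on rank 1, which matches the local shapes printed above;
# only the values are scrambled.
x = torch.arange(18).reshape(3, 6)
expected = {0: x[:2], 1: x[2:]}
for rank, chunk in expected.items():
    print(rank, chunk)
# Expected: rank 0 holds rows [0..5] and [6..11], rank 1 holds row [12..17].
# The quoted output instead interleaves the values, which is the resplit
# ordering bug discussed below (issue #425).
```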
#425 is already created. The resplit problem is known and is being worked on, but it is more complex than it first appears. That being said, your algorithm works for split=0. I do not have an ETA on when the resplit fix will be available. In the meantime, a raise/warning can be implemented, with a comment referencing the mentioned issue; that way, when the issue is solved, the raise/warning can be removed.
The resplit fix is in the works. I'm going to say that we pause this PR until that is done. I have a working resplit algorithm (not touching split=None yet, though) that I am finishing up; hopefully it will be done in the coming days.
heat/core/manipulations.py (outdated)
```python
# The resplit function scrambles the tensor when switching axes, see issue #425
if a.split > 0:
    a = resplit(a, 0)
    warnings.warn("The flattened tensor may have a wrong order for split axes > 0", UserWarning)
```
The warning should be removed once resplit is working.
Description
Implementation of flatten(), which returns a new, flattened tensor.
Issue/s resolved: #176
Changes proposed:
- Add flatten() to heat/core/manipulations.py
Type of change
- New feature
Due Diligence
Does this change modify the behaviour of other functions? If so, which?
No.
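A minimal usage sketch of the new function for the non-distributed case (split=None), where the resplit and ordering concerns discussed above do not apply; the printed values are indicative only:

```python
import heat as ht

# Basic use of the new flatten: it returns a new one-dimensional tensor and
# leaves the input unchanged. With split=None no resplit is triggered, so the
# ordering caveat discussed in the review does not apply here.
a = ht.array([[1, 2, 3], [4, 5, 6]])
b = ht.flatten(a)
print(b.shape)  # (6,)
print(a.shape)  # (2, 3); the original tensor is not modified
```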