Features/240 flipud #496
Conversation
Codecov Report
@@            Coverage Diff            @@
##           master     #496   +/-   ##
=======================================
  Coverage   96.62%   96.62%
=======================================
  Files          65       65
  Lines       14039    14055    +16
=======================================
+ Hits        13565    13581    +16
  Misses        474      474
Continue to review full report at Codecov.
heat/core/manipulations.py
Outdated
>>> a = ht.array([[0,1],[2,3]])
>>> ht.flipud(a)
tensor([[2, 3],
        [0, 1]])
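For intuition, the row-reversal that flipud performs can be sketched in plain Python. This is an illustration of the semantics only, not heat's distributed implementation:

```python
def flipud(a):
    # Reverse the order of rows (axis 0); entries within each row are untouched.
    return a[::-1]

print(flipud([[0, 1], [2, 3]]))  # → [[2, 3], [0, 1]]
```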
multi-process example
Example added.
I noticed that the multi-process examples differ in formatting. Is there a style guide?
there is no style guide on this issue. all that matters is that the processes are indicated
…heat into features/240-flipud
can you resolve the conflicts? the auto-merging doesn't play nice here
I was expecting that, as it is mostly the same code
I think the easiest way will be to simply call flip
that makes sense to me. let me know when the changes are made
All done and ready to merge
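The delegation discussed above (implementing flipud as a call to flip along axis 0) can be sketched in plain Python. The `flip` helper here is a hypothetical stand-in for heat's own function, whose actual signature may differ:

```python
def flip(a, axis):
    # Minimal stand-in: reverse a nested list along the given axis.
    if axis == 0:
        return a[::-1]
    return [flip(row, axis - 1) for row in a]

def flipud(a):
    # flipud is just flip along the first axis (the rows).
    return flip(a, axis=0)

print(flipud([[0, 1], [2, 3]]))  # → [[2, 3], [0, 1]]
```

Delegating keeps the distributed bookkeeping in a single place, which is why it avoids the redundant code mentioned in the PR description.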
Description
Naïve implementation of flipud. It returns a new tensor flipped upside down (rows reversed along axis 0). It will share redundant code with #498.
Issue/s resolved: #240
Changes proposed:
Due Diligence
Does this change modify the behaviour of other functions? If so, which?
no