Bug/825 setitem slice dndarrays #826
Status: Merged
Commits (23, all by coquelin77):

- 639cd57 setitem can now set values with DNDarrays which are not the same size…
- d1a0e55 added new test cases (simple)
- 28c268f added oop redistribute to manipulations
- 61266c1 changelog update
- 174c792 added more test cases to increase coverage and removed some dead code
- 1938f4a abstracted section of setitem: key slice generation
- 66851a5 Merge branch 'master' into bug/825-setitem-slice-dndarrays
- 94af9f4 used key logic in getitem, added typehints/simple docstring to xitem_…
- e4f5364 corrected false logic in key start stop adjustments
- 4960951 Merge branch 'master' into bug/825-setitem-slice-dndarrays
- 2a82e05 added a raise in setitem for when the value and self have different s…
- dc77f17 Merge branch 'master' into bug/825-setitem-slice-dndarrays
- 30e2df4 added handling for single value DNDarrays in key for setitem
- 755c786 corrected try/except in setitem to work with torch tensors as well
- fe6ad27 Merge branch 'master' into bug/825-setitem-slice-dndarrays
- 8d08330 removing dead code
- f193b0b verb correction in lshape map creation
- 34ba9c5 new changelog to add pending additions again
- 2affb59 Merge branch 'bug/825-setitem-slice-dndarrays' of https://github.com/…
- 3ec7b44 added tests for lshape map property and forced creation
- cf8aa1a corrected incorrect changelog, wrong line was moved to the pending a…
- bf39b25 added raise test for splits != case in setitem
- 89fb977 new raise test now only runs on multiple processes
```diff
@@ -576,18 +576,18 @@ def cpu(self) -> DNDarray:
         self.__device = devices.cpu
         return self

-    def create_lshape_map(self, recreate: bool = True) -> torch.Tensor:
+    def create_lshape_map(self, force_check: bool = True) -> torch.Tensor:
         """
         Generate a 'map' of the lshapes of the data on all processes.
         Units are ``(process rank, lshape)``

         Parameters
         ----------
-        recreate : bool, optional
+        force_check : bool, optional
             if False (default) and the lshape map has already been created, use the previous
             result. Otherwise, create the lshape_map
         """
-        if not recreate and self.__lshape_map is not None:
+        if not force_check and self.__lshape_map is not None:
             return self.__lshape_map

         lshape_map = torch.zeros(
```
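The renamed `force_check` flag decides whether the cached map is reused or rebuilt. Conceptually, an lshape map is a `(nprocs, ndim)` table recording each process's local shape for an array split along one dimension. The following is a minimal pure-Python sketch of that idea (standing in for the torch/MPI version; the even-chunking rule, where lower ranks get the remainder, is an assumption for illustration):

```python
# Hypothetical stand-in for DNDarray.create_lshape_map: builds a
# (nprocs, ndim) table of each rank's local shape for a tensor
# split along one dimension. Pure Python instead of torch/MPI.

def chunk_sizes(length: int, nprocs: int) -> list:
    """Split `length` as evenly as possible; the first `rem` ranks get one extra."""
    base, rem = divmod(length, nprocs)
    return [base + (1 if r < rem else 0) for r in range(nprocs)]

def create_lshape_map(gshape: tuple, split: int, nprocs: int) -> list:
    """One row per rank; row r is rank r's local shape (lshape)."""
    sizes = chunk_sizes(gshape[split], nprocs)
    lmap = []
    for r in range(nprocs):
        lshape = list(gshape)
        lshape[split] = sizes[r]  # only the split dimension differs per rank
        lmap.append(lshape)
    return lmap

print(create_lshape_map((10, 4), split=0, nprocs=3))
# → [[4, 4], [3, 4], [3, 4]]
```

Summing column `split` across rows recovers the global extent, which is the invariant the real lshape map maintains.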
```diff
@@ -1367,6 +1367,18 @@ def __setitem__(
             [0., 1., 0., 0., 0.]])
         """
         key = getattr(key, "copy()", key)
+        try:
+            if value.split != self.split:
+                val_split = int(value.split)
+                sp = self.split
+                warnings.warn(
+                    f"\nvalue.split {val_split} not equal to this DNDarray's split:"
+                    f" {sp}. this may cause errors or unwanted behavior",
+                    category=RuntimeWarning,
+                )
+        except (AttributeError, TypeError):
+            pass
+
         if isinstance(key, DNDarray) and key.ndim == self.ndim:
             # this splits the key into torch.Tensors in each dimension for advanced indexing
             lkey = [slice(None, None, None)] * self.ndim
```

Review comments on lines +1375 to +1377:
- 😘
- Do you have a test for the warning? No test according to codecov
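The new guard duck-types `value`: scalars, lists, and torch tensors have no `split` attribute, so the `AttributeError`/`TypeError` is swallowed and only genuine split arrays trigger the warning. A minimal sketch of the same pattern (class and function names here are illustrative, not heat's API):

```python
import warnings

class FakeDNDarray:
    """Illustrative stand-in carrying only the attribute the check needs."""
    def __init__(self, split):
        self.split = split

def check_split(target, value):
    """Warn when both operands are split arrays with differing splits."""
    try:
        if value.split != target.split:
            warnings.warn(
                f"value.split {value.split} not equal to this array's split: "
                f"{target.split}. this may cause errors or unwanted behavior",
                category=RuntimeWarning,
            )
    except (AttributeError, TypeError):
        pass  # value is a scalar or tensor without .split: nothing to compare

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    check_split(FakeDNDarray(0), FakeDNDarray(1))  # mismatched splits: warns
    check_split(FakeDNDarray(0), 3.14)             # scalar: silently skipped
print(len(caught))  # → 1
```

This is the behavior the reviewer asked to see covered by a test; the PR later added such a test gated on multiple processes.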
```diff
@@ -1392,6 +1404,13 @@ def __setitem__(
                 kend = key[ell_ind + 1 :]
                 slices = [slice(None)] * (self.ndim - (len(kst) + len(kend)))
                 key = kst + slices + kend
+
+        for c, k in enumerate(key):
+            try:
+                key[c] = k.item()
+            except (AttributeError, ValueError):
+                pass
+
         key = tuple(key)

         if not self.is_distributed():
```
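The new loop normalizes any single-element array-like in the key to a plain Python scalar via `.item()`: slices and ints have no `.item()` (`AttributeError`), and multi-element tensors raise `ValueError`, so both are left untouched. A sketch with a minimal hypothetical stand-in for a one-element tensor:

```python
class Scalarish:
    """Minimal stand-in for a tensor: .item() succeeds only for one element."""
    def __init__(self, values):
        self.values = list(values)
    def item(self):
        if len(self.values) != 1:
            raise ValueError("only one element tensors can be converted")
        return self.values[0]

def normalize_key(key):
    """Replace 1-element entries with scalars, as in the __setitem__ patch."""
    key = list(key)
    for c, k in enumerate(key):
        try:
            key[c] = k.item()  # AttributeError for slices/ints, ValueError if >1 elem
        except (AttributeError, ValueError):
            pass
    return tuple(key)

out = normalize_key([Scalarish([5]), slice(None), Scalarish([1, 2])])
print(out[0], isinstance(out[2], Scalarish))  # → 5 True
```

This is what lets a single-value DNDarray appear inside an indexing key (commit 30e2df4) without breaking the slice arithmetic further down.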
```diff
@@ -1411,6 +1430,8 @@ def __setitem__(
             chunk_start = chunk_slice[self.split].start
             chunk_end = chunk_slice[self.split].stop

+        self_proxy = torch.ones((1,)).as_strided(self.gshape, [0] * self.ndim)
+
         if not isinstance(key, tuple):
             return self.__setter(key, value)  # returns None
```

```diff
@@ -1448,7 +1469,6 @@ def __setitem__(
             target_reshape_map = torch.zeros(
                 (self.comm.size, self.ndim), dtype=torch.int, device=self.device.torch_device
             )
-            self_proxy = torch.ones((1,)).as_strided(self.gshape, [0] * self.ndim)
             for r in range(self.comm.size):
                 if r not in actives:
                     loc_key = key.copy()
```
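The hoisted `self_proxy` is a zero-stride trick: `as_strided` with all-zero strides produces a tensor that advertises the full global shape while being backed by a single element, so shape and indexing arithmetic can run without allocating gshape-sized memory. The same trick in NumPy, used here only to illustrate the mechanism (heat does it with torch):

```python
import numpy as np

gshape = (1000, 1000)  # pretend global shape of a distributed array
one = np.ones((1,))
# all-zero strides: every index resolves to the same underlying element,
# so the proxy has the full shape but only one element of real storage
proxy = np.lib.stride_tricks.as_strided(one, shape=gshape, strides=(0,) * len(gshape))

print(proxy.shape)              # → (1000, 1000)
print(proxy[123, 456])          # → 1.0
# slicing the proxy yields the shape the same slice would have on the real array
print(proxy[10:20, ::3].shape)  # → (10, 334)
```

Moving the proxy before the `isinstance(key, tuple)` early return makes it available to every code path that needs to resolve a key against the global shape, rather than only the reshaping branch it previously lived in.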