Rebuilt flatten auxcoords #3379
Conversation
Commits:
- …ract calls)
- Adding some tests and fixing some edge cases
- Removing unused import
- Add comment and rename var for clarity
- Add comment and rename var for clarity
- Minor fix
- Retain lazy data
- Support aux coords on leading dimensions, and split across non-neighbouring dimensions. The routine chooses the first dimension of the AuxCoord as the new flat one.
- Fix whitespace
- Fix issue with multiple dim coords being flattened
- PEP8 conformance
- Long lines
- Factor out a function for flattening arbitrary dimensions
- Fix case for coordinates with bounds
- Flatten all dims by default
- Remove unused imports
- Remove unused imports
- …ions. Other minor whitespace and text changes
        c not in coords_to_ignore and
        c not in cube.derived_coords]

new_data = cube.core_data().reshape(new_shape)
I think this will only work sensibly if dims are consecutive and increasing. One of the tests has dims=(0, 2), so I think we need a transpose in here. Note that non-increasing auxcoord dims are possible (#2606).
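A minimal sketch of the suggested transpose-then-reshape, assuming the flattened dimensions should end up merged at position min(dims); the function name and details here are illustrative, not code from this PR:

import numpy as np

def flatten_dims(data, dims):
    # Sort so that reversed dims (e.g. (2, 0)) behave like (0, 2).
    dims = sorted(dims)
    other_dims = [d for d in range(data.ndim) if d not in dims]
    flat_pos = dims[0]  # == min(dims) after sorting
    # Transpose so the dims to be flattened become consecutive,
    # starting at flat_pos; the reshape is then a plain merge.
    order = other_dims[:flat_pos] + dims + other_dims[flat_pos:]
    shape = np.asarray(data.shape)
    new_shape = np.insert(shape[other_dims], flat_pos, shape[dims].prod())
    return data.transpose(order).reshape(tuple(new_shape))

For example, flatten_dims(np.empty((10, 11, 12, 13)), (0, 2)).shape gives (120, 11, 13), and dims=(2, 0) gives the same result.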
dims = dims if dims is not None else all_dims

other_dims = list(set(all_dims) - set(dims))
This set of numbers needs to be sorted: the ordering of list(set(...)) is not guaranteed.
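For example, a one-line fix along these lines (an assumption about the intent, not the PR's change):

other_dims = sorted(set(all_dims) - set(dims))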
other_dims = list(set(all_dims) - set(dims))
shape_array = np.asarray(cube.shape)
# The new (flat) dimension will be in the first dimension of the aux coord
This is not true if dims are out of order, since then dims[0] != min(dims).
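Presumably the intended value is the smallest of the flattened dimensions (a guess at the intent, not the PR's code):

flat_dim = min(dims)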
flat_dim = dims[0]
dim_shape = shape_array[list(dims)]
new_dim_shape = np.product(dim_shape)
new_shape = np.insert(shape_array[other_dims], flat_dim, new_dim_shape)
There are problems with this calc, even when the dims are in order.
Examples testing this part of the code, extracted into a standalone function ...
import numpy as np

def newshape(shape, dims):
    all_dims = range(len(shape))
    shape_array = np.asarray(shape)
    # other_dims came from the enclosing scope in the original snippet;
    # computed here (as in the PR code) so the function is self-contained.
    # Note the ordering of list(set(...)) is not guaranteed.
    other_dims = list(set(all_dims) - set(dims))
    flat_dim = dims[0]
    dim_shape = shape_array[list(dims)]
    new_dim_shape = np.product(dim_shape)
    new_shape = np.insert(shape_array[other_dims], flat_dim, new_dim_shape)
    return new_shape
Then:

newshape((10, 11, 12, 13), (0, 2)) --> [120, 11, 13]

is OK, but

newshape((10, 11, 12, 13), (2, 0)) --> [11, 13, 120]

is wrong (reversed dims order): it presumably should be the same as the previous result. Likewise,

newshape((10, 11, 12, 13), (1, 2)) --> [11, 132, 13]

is wrong: it should give (10, 132, 13).
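For comparison, a sketch of a shape calculation that treats both orderings the same (illustrative; newshape_fixed is a hypothetical name, and this fixes only the shape calculation, while the data itself still needs the transpose noted above):

def newshape_fixed(shape, dims):
    # Sort both the flattened and remaining dims so ordering is deterministic.
    dims = sorted(dims)
    other_dims = sorted(set(range(len(shape))) - set(dims))
    shape_array = np.asarray(shape)
    flat_dim = dims[0]  # == min(dims) after sorting
    new_dim_shape = np.product(shape_array[dims])
    return np.insert(shape_array[other_dims], flat_dim, new_dim_shape)

With this, newshape_fixed((10, 11, 12, 13), (0, 2)) and newshape_fixed((10, 11, 12, 13), (2, 0)) both give [120, 11, 13], and dims=(1, 2) gives [10, 132, 13].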
# Any DimCoords spanning these dimensions have been demoted so
# there should only be AuxCoords left
new_aux_coords.append((c.copy(c.points.flat, new_bounds), flat_dim))
# TODO: Deal with Aux Factories and cell measures....
As stated, I think these aspects need dealing with, or at the very least detecting so that an error can be issued.
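A sketch of the kind of guard being suggested (an illustration only, assuming the cube exposes aux_factories and cell_measures() as Iris cubes do):

# Hypothetical guard, not the PR's code.
if cube.aux_factories or cube.cell_measures():
    raise NotImplementedError(
        "Flattening cubes with aux factories or cell measures "
        "is not yet supported.")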
Given the outstanding problems, I think we are better off going back to the original PR and waiting for @duncanwp to respond.
WIP: do not merge
Cherry-picked the later versions of @duncanwp's commits from #3030.