Fix map_blocks examples #4305

Merged: 1 commit merged into pydata:master on Aug 4, 2020

Conversation

TomAugspurger (Contributor):

The examples on master raised with

ValueError: Result from applying user function has unexpected coordinate variables {'month'}.

This PR updates the example to include the month coordinate. pytest --doctest-modules passes on these three examples now.
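
For context, a minimal sketch of how the pre-fix example presumably hits that error; the array construction, the chunking, and the template argument here are assumptions for illustration, not copied from the old docstring:

```python
import numpy as np
import xarray as xr

def calculate_anomaly(da, groupby_type="time.month"):
    # grouping by "time.month" attaches a "month" coordinate to the result
    gb = da.groupby(groupby_type)
    return gb - gb.mean(dim="time")

time = xr.cftime_range("1990-01", "1992-01", freq="M")
# The old example built the input without a "month" coordinate ...
array = xr.DataArray(
    np.random.rand(len(time)), dims=["time"], coords={"time": time}
).chunk()
# ... so the blockwise result carries a "month" coordinate that the template
# lacks, and map_blocks raises the ValueError quoted above when computed.
array.map_blocks(calculate_anomaly, template=array).compute()
```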

@@ -3358,9 +3358,12 @@ def map_blocks(
... clim = gb.mean(dim="time")
... return gb - clim
>>> time = xr.cftime_range("1990-01", "1992-01", freq="M")
>>> month = xr.DataArray(time.month, coords={"time": time}, dims=["time"])
TomAugspurger (Contributor Author) commented on this line:

Hopefully this is the best way to include this coordinate. Just doing DataArray(..., dims=["time"], coords={"time": time, "month": time.month}) raised with

ValueError: coordinate month has dimensions ('month',), but these are not a subset of the DataArray dimensions ['time']
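
For reference, a rough sketch of how the corrected example fits together end to end; the function name, the random data, the chunking, and the template argument are assumptions for illustration rather than part of the visible hunk:

```python
import numpy as np
import xarray as xr

def calculate_anomaly(da, groupby_type="time.month"):
    gb = da.groupby(groupby_type)
    clim = gb.mean(dim="time")
    return gb - clim

time = xr.cftime_range("1990-01", "1992-01", freq="M")
# Wrap the month values in a DataArray over the existing "time" dimension;
# passing coords={"time": time, "month": time.month} directly makes xarray
# treat "month" as its own dimension, hence the ValueError quoted above.
month = xr.DataArray(time.month, coords={"time": time}, dims=["time"])

array = xr.DataArray(
    np.random.rand(len(time)),
    dims=["time"],
    coords={"time": time, "month": month},
).chunk()

# The template now carries the "month" coordinate, so the coordinates of the
# blockwise result match it and the doctest passes.
array.map_blocks(calculate_anomaly, template=array).compute()
```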

@max-sixty (Collaborator):

Thanks @TomAugspurger, appreciate the fix.

This is part of the broader #3837.

@TomAugspurger (Contributor Author):

The doc failure looks unrelated:


>>>-------------------------------------------------------------------------
Exception in /home/docs/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/doc/plotting.rst at block ending on line None
Specify :okexcept: as an option in the ipython:: block to suppress this message
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-75-c7d6afd7f8c5> in <module>
----> 1 g_simple = t.plot(x="lon", y="lat", col="time", col_wrap=3)

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/plot/plot.py in __call__(self, **kwargs)
    444 
    445     def __call__(self, **kwargs):
--> 446         return plot(self._da, **kwargs)
    447 
    448     # we can't use functools.wraps here since that also modifies the name / qualname

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/plot/plot.py in plot(darray, row, col, col_wrap, ax, hue, rtol, subplot_kws, **kwargs)
    198     kwargs["ax"] = ax
    199 
--> 200     return plotfunc(darray, **kwargs)
    201 
    202 

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/plot/plot.py in newplotfunc(darray, x, y, figsize, size, aspect, ax, row, col, col_wrap, xincrease, yincrease, add_colorbar, add_labels, vmin, vmax, cmap, center, robust, extend, levels, infer_intervals, colors, subplot_kws, cbar_ax, cbar_kwargs, xscale, yscale, xticks, yticks, xlim, ylim, norm, **kwargs)
    636             # Need the decorated plotting function
    637             allargs["plotfunc"] = globals()[plotfunc.__name__]
--> 638             return _easy_facetgrid(darray, kind="dataarray", **allargs)
    639 
    640         plt = import_matplotlib_pyplot()

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/plot/facetgrid.py in _easy_facetgrid(data, plotfunc, kind, x, y, row, col, col_wrap, sharex, sharey, aspect, size, subplot_kws, ax, figsize, **kwargs)
    642 
    643     if kind == "dataarray":
--> 644         return g.map_dataarray(plotfunc, x, y, **kwargs)
    645 
    646     if kind == "dataset":

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/plot/facetgrid.py in map_dataarray(self, func, x, y, **kwargs)
    263         # Get x, y labels for the first subplot
    264         x, y = _infer_xy_labels(
--> 265             darray=self.data.loc[self.name_dicts.flat[0]],
    266             x=x,
    267             y=y,

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/core/dataarray.py in __getitem__(self, key)
    196             labels = indexing.expanded_indexer(key, self.data_array.ndim)
    197             key = dict(zip(self.data_array.dims, labels))
--> 198         return self.data_array.sel(**key)
    199 
    200     def __setitem__(self, key, value) -> None:

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/core/dataarray.py in sel(self, indexers, method, tolerance, drop, **indexers_kwargs)
   1147 
   1148         """
-> 1149         ds = self._to_temp_dataset().sel(
   1150             indexers=indexers,
   1151             drop=drop,

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/core/dataset.py in sel(self, indexers, method, tolerance, drop, **indexers_kwargs)
   2099         """
   2100         indexers = either_dict_or_kwargs(indexers, indexers_kwargs, "sel")
-> 2101         pos_indexers, new_indexes = remap_label_indexers(
   2102             self, indexers=indexers, method=method, tolerance=tolerance
   2103         )

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/core/coordinates.py in remap_label_indexers(obj, indexers, method, tolerance, **indexers_kwargs)
    394     }
    395 
--> 396     pos_indexers, new_indexes = indexing.remap_label_indexers(
    397         obj, v_indexers, method=method, tolerance=tolerance
    398     )

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/core/indexing.py in remap_label_indexers(data_obj, indexers, method, tolerance)
    268             coords_dtype = data_obj.coords[dim].dtype
    269             label = maybe_cast_to_coords_dtype(label, coords_dtype)
--> 270             idxr, new_idx = convert_label_indexer(index, label, dim, method, tolerance)
    271             pos_indexers[dim] = idxr
    272             if new_idx is not None:

~/checkouts/readthedocs.org/user_builds/xray/checkouts/4305/xarray/core/indexing.py in convert_label_indexer(index, label, index_name, method, tolerance)
    187                 indexer = index.get_loc(label.item())
    188             else:
--> 189                 indexer = index.get_loc(
    190                     label.item(), method=method, tolerance=tolerance
    191                 )

~/checkouts/readthedocs.org/user_builds/xray/conda/4305/lib/python3.8/site-packages/pandas/core/indexes/datetimes.py in get_loc(self, key, method, tolerance)
    620         else:
    621             # unrecognized type
--> 622             raise KeyError(key)
    623 
    624         try:

KeyError: 1356998400000000000
<<<-------------------------------------------------------------------------

max-sixty merged commit e1dafe6 into pydata:master on Aug 4, 2020
@max-sixty (Collaborator):

That does look unrelated. Thanks @TomAugspurger

@max-sixty (Collaborator):

Separately, I'm crunched at the moment, but if anyone has cycles to look at that failure, that would be great. The doc build has not been kind to us recently...

@keewis (Collaborator) commented on Aug 4, 2020:

The failure is #4283 and should be fixed by #4292. Also, there are warnings about *args, **kwargs, which will be fixed by the upcoming sphinx release (scheduled in about 4 days). If we don't want to wait for that, I guess we can disable fail_on_warnings again?

dcherian added a commit to rpgoldman/xarray that referenced this pull request Aug 14, 2020
* 'master' of github.com:pydata/xarray: (260 commits)
  Increase support window of all dependencies (pydata#4296)
  Implement interp for interpolating between chunks of data (dask) (pydata#4155)
  Add @mathause to current core developers. (pydata#4335)
  install sphinx-autosummary-accessors from conda-forge (pydata#4332)
  Use sphinx-accessors-autosummary (pydata#4323)
  ndrolling fixes (pydata#4329)
  DOC: fix typo argmin -> argmax in DataArray.argmax docstring (pydata#4327)
  pin sphinx to 3.1(pydata#4326)
  nd-rolling (pydata#4219)
  Implicit dask import 4164 (pydata#4318)
  allow customizing the inline repr of a duck array (pydata#4248)
  silence the known docs CI issues (pydata#4316)
  enh: fixed pydata#4302 (pydata#4315)
  Remove all unused and warn-raising methods from AbstractDataStore (pydata#4310)
  Fix map_blocks example (pydata#4305)
  Fix docstring for missing_dims argument to isel methods (pydata#4298)
  Support for PyCharm remote deployment (pydata#4299)
  Update map_blocks and map_overlap docstrings (pydata#4303)
  Lazily load resource files (pydata#4297)
  warn about the removal of the ufuncs (pydata#4268)
  ...
dcherian added a commit to dcherian/xarray that referenced this pull request Aug 15, 2020
* upstream/master: (34 commits)
  Fix bug in computing means of cftime.datetime arrays (pydata#4344)
  fix some str accessor inconsistencies (pydata#4339)
  pin matplotlib in ci/requirements/doc.yml (pydata#4340)
  Clarify drop_vars return value. (pydata#4244)
  Support explicitly setting a dimension order with to_dataframe() (pydata#4333)
  Increase support window of all dependencies (pydata#4296)
  Implement interp for interpolating between chunks of data (dask) (pydata#4155)
  Add @mathause to current core developers. (pydata#4335)
  install sphinx-autosummary-accessors from conda-forge (pydata#4332)
  Use sphinx-accessors-autosummary (pydata#4323)
  ndrolling fixes (pydata#4329)
  DOC: fix typo argmin -> argmax in DataArray.argmax docstring (pydata#4327)
  pin sphinx to 3.1(pydata#4326)
  nd-rolling (pydata#4219)
  Implicit dask import 4164 (pydata#4318)
  allow customizing the inline repr of a duck array (pydata#4248)
  silence the known docs CI issues (pydata#4316)
  enh: fixed pydata#4302 (pydata#4315)
  Remove all unused and warn-raising methods from AbstractDataStore (pydata#4310)
  Fix map_blocks example (pydata#4305)
  ...
dcherian added a commit to dcherian/xarray that referenced this pull request Aug 16, 2020
* upstream/master: (40 commits)
  Fix bug in computing means of cftime.datetime arrays (pydata#4344)
  fix some str accessor inconsistencies (pydata#4339)
  pin matplotlib in ci/requirements/doc.yml (pydata#4340)
  Clarify drop_vars return value. (pydata#4244)
  Support explicitly setting a dimension order with to_dataframe() (pydata#4333)
  Increase support window of all dependencies (pydata#4296)
  Implement interp for interpolating between chunks of data (dask) (pydata#4155)
  Add @mathause to current core developers. (pydata#4335)
  install sphinx-autosummary-accessors from conda-forge (pydata#4332)
  Use sphinx-accessors-autosummary (pydata#4323)
  ndrolling fixes (pydata#4329)
  DOC: fix typo argmin -> argmax in DataArray.argmax docstring (pydata#4327)
  pin sphinx to 3.1(pydata#4326)
  nd-rolling (pydata#4219)
  Implicit dask import 4164 (pydata#4318)
  allow customizing the inline repr of a duck array (pydata#4248)
  silence the known docs CI issues (pydata#4316)
  enh: fixed pydata#4302 (pydata#4315)
  Remove all unused and warn-raising methods from AbstractDataStore (pydata#4310)
  Fix map_blocks example (pydata#4305)
  ...