⚠️ Nightly upstream-dev CI failed ⚠️ #5600

Closed · github-actions bot opened this issue Jul 14, 2021 · 7 comments
Labels: CI (Continuous Integration tools)

github-actions bot commented Jul 14, 2021

Workflow Run URL

Python 3.9 Test Summary Info

github-actions bot added the CI (Continuous Integration tools) label on Jul 14, 2021

keewis commented Jul 14, 2021

Does anyone know what is causing this? A change to either zarr or fsspec, maybe?

cc @martindurant

For reference, here's the full traceback:
_______________________________ test_open_fsspec _______________________________

    @requires_zarr
    @requires_fsspec
    @pytest.mark.filterwarnings("ignore:deallocating CachingFileManager")
    def test_open_fsspec():
        import fsspec
        import zarr
    
        if not hasattr(zarr.storage, "FSStore") or not hasattr(
            zarr.storage.FSStore, "getitems"
        ):
            pytest.skip("zarr too old")
    
        ds = open_dataset(os.path.join(os.path.dirname(__file__), "data", "example_1.nc"))
    
        m = fsspec.filesystem("memory")
        mm = m.get_mapper("out1.zarr")
        ds.to_zarr(mm)  # old interface
        ds0 = ds.copy()
        ds0["time"] = ds.time + pd.to_timedelta("1 day")
        mm = m.get_mapper("out2.zarr")
        ds0.to_zarr(mm)  # old interface
    
        # single dataset
        url = "memory://out2.zarr"
        ds2 = open_dataset(url, engine="zarr")
        assert ds0 == ds2
    
        # single dataset with caching
        url = "simplecache::memory://out2.zarr"
>       ds2 = open_dataset(url, engine="zarr")

/home/runner/work/xarray/xarray/xarray/tests/test_backends.py:5150: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/home/runner/work/xarray/xarray/xarray/backends/api.py:497: in open_dataset
    backend_ds = backend.open_dataset(
/home/runner/work/xarray/xarray/xarray/backends/zarr.py:839: in open_dataset
    ds = store_entrypoint.open_dataset(
/home/runner/work/xarray/xarray/xarray/backends/store.py:27: in open_dataset
    vars, attrs, coord_names = conventions.decode_cf_variables(
/home/runner/work/xarray/xarray/xarray/conventions.py:512: in decode_cf_variables
    new_vars[k] = decode_cf_variable(
/home/runner/work/xarray/xarray/xarray/conventions.py:360: in decode_cf_variable
    var = times.CFDatetimeCoder(use_cftime=use_cftime).decode(var, name=name)
/home/runner/work/xarray/xarray/xarray/coding/times.py:527: in decode
    dtype = _decode_cf_datetime_dtype(data, units, calendar, self.use_cftime)
/home/runner/work/xarray/xarray/xarray/coding/times.py:145: in _decode_cf_datetime_dtype
    [first_n_items(values, 1) or [0], last_item(values) or [0]]
/home/runner/work/xarray/xarray/xarray/core/formatting.py:72: in first_n_items
    return np.asarray(array).flat[:n_desired]
/home/runner/work/xarray/xarray/xarray/core/indexing.py:354: in __array__
    return np.asarray(self.array, dtype=dtype)
/home/runner/work/xarray/xarray/xarray/core/indexing.py:419: in __array__
    return np.asarray(array[self.key], dtype=None)
/home/runner/work/xarray/xarray/xarray/backends/zarr.py:75: in __getitem__
    return array[key.tuple]
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:662: in __getitem__
    return self.get_basic_selection(selection, fields=fields)
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:787: in get_basic_selection
    return self._get_basic_selection_nd(selection=selection, out=out,
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:830: in _get_basic_selection_nd
    return self._get_selection(indexer=indexer, out=out, fields=fields)
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:1125: in _get_selection
    self._chunk_getitems(lchunk_coords, lchunk_selection, out, lout_selection,
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/core.py:1836: in _chunk_getitems
    cdatas = self.chunk_store.getitems(ckeys, on_error="omit")
/usr/share/miniconda/envs/xarray-tests/lib/python3.9/site-packages/zarr/storage.py:1085: in getitems
    results = self.map.getitems(keys_transformed, on_error="omit")
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <fsspec.mapping.FSMap object at 0x7f3172a8e9a0>, keys = ['time/0']
on_error = 'omit'

    def getitems(self, keys, on_error="raise"):
        """Fetch multiple items from the store
    
        If the backend is async-able, this might proceed concurrently
    
        Parameters
        ----------
        keys: list(str)
            The keys to be fetched
        on_error : "raise", "omit", "return"
            If raise, an underlying exception will be raised (converted to KeyError
            if the type is in self.missing_exceptions); if omit, keys with exception
            will simply not be included in the output; if "return", all keys are
            included in the output, but the value will be bytes or an exception
            instance.
    
        Returns
        -------
        dict(key, bytes|exception)
        """
        keys2 = [self._key_to_str(k) for k in keys]
        oe = on_error if on_error == "raise" else "return"
        try:
            out = self.fs.cat(keys2, on_error=oe)
        except self.missing_exceptions as e:
            raise KeyError from e
        out = {
            k: (KeyError() if isinstance(v, self.missing_exceptions) else v)
>           for k, v in out.items()
        }
E       AttributeError: 'bytes' object has no attribute 'items'
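
To make the failure mode concrete, here is a minimal sketch (my reading of the traceback, using an in-memory filesystem; not the exact CI setup):

    import fsspec

    # getitems() with a single key ends up calling fs.cat() with a
    # one-element list; the affected fsspec release returned the raw bytes
    # for that instead of a {key: bytes} dict, so the dict comprehension at
    # the end of getitems() blows up
    m = fsspec.filesystem("memory").get_mapper("out2.zarr")
    m["time/0"] = b"\x00" * 8

    out = m.getitems(["time/0"], on_error="omit")
    print(out)  # expected {'time/0': b'\x00...'}; on the broken release
                # this line is never reached because getitems() raises
                # AttributeError: 'bytes' object has no attribute 'items'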

martindurant commented:

There was a release of fsspec, but I don't see why anything would have changed here. Can you see whether the failure is associated with the new version?

keewis commented Jul 15, 2021

There are a few changes to the environment between the last passing and the first failing run, but the fsspec update is among them.

I also just noticed that we don't test the upstream version of fsspec in the upstream-dev CI: should we change that?

martindurant commented:

> should we change that?

Perhaps so? We are releasing pretty frequently, though, and if there is a problem here, we'd be happy to put out a bugfix.

keewis commented Jul 15, 2021

Apparently something in distributed changed, too, causing the test collection phase to fail with an assertion error (something about timeout not being set appropriately in gen_cluster; see the logs). dask/distributed#5022, maybe? cc @crusaderky
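
For reference, the pattern in question looks roughly like this (a sketch of distributed's test utilities as I understand them; the test name is made up, not the one that failed):

    from distributed.utils_test import gen_cluster

    # gen_cluster spins up a scheduler (s), two workers (a, b) and,
    # with client=True, a client (c) for the decorated coroutine; newer
    # distributed appears to assert at collection time that a usable
    # timeout is configured for this decorator
    @gen_cluster(client=True)
    async def test_some_distributed_roundtrip(c, s, a, b):
        ...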

keewis commented Jul 15, 2021

@martindurant, I think this is fsspec/filesystem_spec#707. Can you confirm?
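
If I read that issue right, the relevant contract is the one below (a sketch of my interpretation of fsspec's cat(), pending confirmation):

    import fsspec

    fs = fsspec.filesystem("memory")
    fs.pipe_file("/a", b"data")

    assert isinstance(fs.cat("/a"), bytes)   # single path -> bytes
    assert isinstance(fs.cat(["/a"]), dict)  # list of paths -> dict,
                                             # even for one element
    # the regression made the one-element-list case return bytes as well,
    # which is exactly what FSMap.getitems() trips over in the traceback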

keewis commented Jul 15, 2021

The CI is still running, but test_backends.py passes, so I'm going to close this. Since the issue is also present in a released version of fsspec, the normal CI will keep failing until the next release (which I guess should be soon).

Edit: thanks for helping with the debugging, @martindurant
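
(If the red CI gets too noisy in the meantime, something like the sketch below could guard the test; the pinned version number is my guess from the release timing, and this is not something we actually merged:)

    import fsspec
    import pytest
    from packaging.version import Version

    # hypothetical stopgap: xfail test_open_fsspec on the broken release
    # until the fixed fsspec ships
    xfail_fsspec_707 = pytest.mark.xfail(
        Version(fsspec.__version__) == Version("2021.7.0"),
        reason="fsspec/filesystem_spec#707",
    )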

keewis closed this as completed on Jul 15, 2021
TomNicholas added a commit to TomNicholas/xarray that referenced this issue Jul 16, 2021
TomNicholas added a commit that referenced this issue Jul 21, 2021
TomNicholas added a commit that referenced this issue Jul 21, 2021