
ignore h5py 2.10.0 warnings and fix invalid_netcdf warning test. #3301

Merged 6 commits on Sep 13, 2019
5 changes: 4 additions & 1 deletion xarray/tests/test_backends.py
@@ -2163,6 +2163,7 @@ def test_encoding_unlimited_dims(self):

@requires_h5netcdf
@requires_netCDF4
@pytest.mark.filterwarnings("ignore:use make_scale(name) instead")
class TestH5NetCDFData(NetCDF4Base):
engine = "h5netcdf"

@@ -2171,10 +2172,11 @@ def create_store(self):
with create_tmp_file() as tmp_file:
yield backends.H5NetCDFStore(tmp_file, "w")

# TODO: Reduce num_warns by 1 when h5netcdf is updated to not issue the make_scale warning
@pytest.mark.filterwarnings("ignore:complex dtypes are supported by h5py")
@pytest.mark.parametrize(
"invalid_netcdf, warns, num_warns",
[(None, FutureWarning, 1), (False, FutureWarning, 1), (True, None, 0)],
Member commented:
Maybe we could just remove the assert len(record) == num_warns line below? I think pytest already verifies that we get a warning with pytest.warns.

We can use the match argument if we want to keep it more specific: https://docs.pytest.org/en/latest/warnings.html#warns
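A minimal sketch (not from the PR) of the `match` argument mentioned above: `pytest.warns` fails the test if no matching warning is raised, and `match` applies a regex to the warning message so the assertion stays specific. The `issue_warning` helper is hypothetical, standing in for the backend code that emits the warning.

```python
import warnings

import pytest

# Hypothetical stand-in for code that emits the deprecation warning;
# the real test exercises xarray's h5netcdf backend instead.
def issue_warning():
    warnings.warn("complex dtypes are supported by h5py", FutureWarning)

# pytest.warns fails on exit if no FutureWarning matching the regex
# was raised inside the block; `record` collects the captured warnings.
with pytest.warns(FutureWarning, match="complex dtypes") as record:
    issue_warning()
```

Used this way, the explicit `assert len(record) == num_warns` becomes redundant for the cases where a warning is expected.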

Contributor Author replied:

I think the complication is testing that the warning is not raised when invalid_netcdf=True. I've changed it to count the number of warnings with the specified warning type and message so that it's more robust.
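A stdlib-only sketch (not the PR's code) of the counting approach described above: record all warnings, then count only those matching both the category and a message substring, so unrelated warnings from other libraries don't break the count. The `count_matching` helper is hypothetical.

```python
import warnings

def count_matching(record, category, substring):
    """Count recorded warnings of `category` whose message contains `substring`."""
    return sum(
        1
        for w in record
        if issubclass(w.category, category) and substring in str(w.message)
    )

with warnings.catch_warnings(record=True) as record:
    warnings.simplefilter("always")
    # Two warnings, only one of which matches the type/message filter.
    warnings.warn("use make_scale(name) instead", FutureWarning)
    warnings.warn("some unrelated warning", UserWarning)
```

This also covers the `invalid_netcdf=True` case: asserting the matching count is zero verifies the warning is *not* raised, which a plain `pytest.warns` cannot express.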

[(None, FutureWarning, 2), (False, FutureWarning, 2), (True, None, 1)],
)
def test_complex(self, invalid_netcdf, warns, num_warns):
expected = Dataset({"x": ("y", np.ones(5) + 1j * np.ones(5))})
@@ -2451,6 +2453,7 @@ def skip_if_not_engine(engine):


@requires_dask
@pytest.mark.filterwarnings("ignore:use make_scale(name) instead")
def test_open_mfdataset_manyfiles(
readengine, nfiles, parallel, chunks, file_cache_maxsize
):