Merge remote-tracking branch 'upstream/master' into scipy19-docs
* upstream/master: (43 commits)
  Add hypothesis support to related projects (#3335)
  More doc fixes (#3333)
  Improve the documentation of swap_dims (#3331)
  fix the doc names of the return value of swap_dims (#3329)
  Fix isel performance regression (#3319)
  Allow weakref (#3318)
  Clarify that "scatter" is a plotting method in what's new. (#3316)
  Fix whats-new date :/
  Revert to dev version
  Release v0.13.0
  auto_combine deprecation to 0.14 (#3314)
  Deprecation: groupby, resample default dim. (#3313)
  Raise error if cmap is list of colors (#3310)
  Refactor concat to use merge for non-concatenated variables (#3239)
  Honor `keep_attrs` in DataArray.quantile (#3305)
  Fix DataArray api doc (#3309)
  Accept int value in head, thin and tail (#3298)
  ignore h5py 2.10.0 warnings and fix invalid_netcdf warning test. (#3301)
  Update why-xarray.rst with clearer expression (#3307)
  Compat and encoding deprecation to 0.14 (#3294)
  ...
dcherian committed Sep 24, 2019
2 parents f82d112 + e1183e8 commit aeb15b5
Showing 68 changed files with 1,916 additions and 1,393 deletions.
1 change: 1 addition & 0 deletions asv_bench/benchmarks/combine.py
@@ -1,4 +1,5 @@
import numpy as np

import xarray as xr


13 changes: 13 additions & 0 deletions asv_bench/benchmarks/indexing.py
@@ -125,3 +125,16 @@ def setup(self, key):
requires_dask()
super().setup(key)
self.ds = self.ds.chunk({"x": 100, "y": 50, "t": 50})


+class BooleanIndexing:
+    # https://github.com/pydata/xarray/issues/2227
+    def setup(self):
+        self.ds = xr.Dataset(
+            {"a": ("time", np.arange(10_000_000))},
+            coords={"time": np.arange(10_000_000)},
+        )
+        self.time_filter = self.ds.time > 50_000
+
+    def time_indexing(self):
+        self.ds.isel(time=self.time_filter)
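What this new benchmark measures is boolean-mask selection over a large coordinate. A plain-NumPy sketch of the same operation (an analogy only, not xarray's implementation) looks like this:

```python
import numpy as np

# Sketch of the filtering the benchmark above exercises: build a large
# 1-D array, a boolean mask over it, and keep only the True positions.
values = np.arange(10_000_000)
time_filter = values > 50_000      # boolean mask, same shape as `values`
selected = values[time_filter]     # elements 50_001 .. 9_999_999 survive

assert selected.size == 10_000_000 - 50_001
assert selected[0] == 50_001
```

In xarray the mask is a `DataArray`, so `isel` can align it by dimension name rather than by position.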
9 changes: 8 additions & 1 deletion doc/api.rst
@@ -8,7 +8,7 @@ This page provides an auto-generated summary of xarray's API. For more details
and examples, refer to the relevant chapters in the main part of the
documentation.

-See also: :ref:`public api`_.
+See also: :ref:`public api`

Top-level functions
===================
@@ -117,6 +117,9 @@ Indexing
Dataset.loc
Dataset.isel
Dataset.sel
+Dataset.head
+Dataset.tail
+Dataset.thin
Dataset.squeeze
Dataset.interp
Dataset.interp_like
@@ -279,6 +282,9 @@ Indexing
DataArray.loc
DataArray.isel
DataArray.sel
+DataArray.head
+DataArray.tail
+DataArray.thin
DataArray.squeeze
DataArray.interp
DataArray.interp_like
@@ -604,6 +610,7 @@ Plotting

Dataset.plot
DataArray.plot
+Dataset.plot.scatter
plot.plot
plot.contourf
plot.contour
20 changes: 11 additions & 9 deletions doc/dask.rst
@@ -75,13 +75,14 @@ entirely equivalent to opening a dataset using ``open_dataset`` and then
chunking the data using the ``chunk`` method, e.g.,
``xr.open_dataset('example-data.nc').chunk({'time': 10})``.
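The idea behind ``chunk({'time': 10})`` is splitting an array along one dimension into fixed-size blocks. A plain-NumPy sketch of that splitting (an illustration, not what dask actually stores) is:

```python
import numpy as np

# Sketch: splitting a 25-element "time" axis into chunks of 10 is what
# a chunk spec like {'time': 10} requests; the last chunk holds the remainder.
time = np.arange(25)
chunks = [time[i:i + 10] for i in range(0, time.size, 10)]

assert [c.size for c in chunks] == [10, 10, 5]
assert np.concatenate(chunks).tolist() == time.tolist()
```

Dask keeps the blocks lazy instead of materializing them, but the block boundaries follow this same arithmetic.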

-To open multiple files simultaneously, use :py:func:`~xarray.open_mfdataset`::
+To open multiple files simultaneously in parallel using Dask delayed,
+use :py:func:`~xarray.open_mfdataset`::

-    xr.open_mfdataset('my/files/*.nc')
+    xr.open_mfdataset('my/files/*.nc', parallel=True)

This function will automatically concatenate and merge datasets into one in
the simple cases that it understands (see :py:func:`~xarray.auto_combine`
-for the full disclaimer). By default, ``open_mfdataset`` will chunk each
+for the full disclaimer). By default, :py:func:`~xarray.open_mfdataset` will chunk each
netCDF file into a single Dask array; again, supply the ``chunks`` argument to
control the size of the resulting Dask arrays. In more complex cases, you can
open each file individually using ``open_dataset`` and merge the result, as
@@ -132,6 +133,13 @@ A dataset can also be converted to a Dask DataFrame using :py:meth:`~xarray.Data
Dask DataFrames do not support multi-indexes so the coordinate variables from the dataset are included as columns in the Dask DataFrame.

+.. ipython:: python
+    :suppress:
+
+    import os
+
+    os.remove('example-data.nc')
+    os.remove('manipulated-example-data.nc')

Using Dask with xarray
----------------------

@@ -373,12 +381,6 @@ one million elements (e.g., a 1000x1000 matrix). With large arrays (10+ GB), the
cost of queueing up Dask operations can be noticeable, and you may need even
larger chunksizes.

-.. ipython:: python
-    :suppress:
-
-    import os
-
-    os.remove('example-data.nc')

Optimization Tips
-----------------

2 changes: 1 addition & 1 deletion doc/gallery/plot_cartopy_facetgrid.py
@@ -41,6 +41,6 @@
ax.set_extent([-160, -30, 5, 75])
# Without this aspect attribute, the maps will look chaotic and the
# "extent" attribute above will be ignored
-ax.set_aspect("equal", "box-forced")
+ax.set_aspect("equal")

plt.show()
11 changes: 1 addition & 10 deletions doc/indexing.rst
@@ -236,9 +236,8 @@ The :py:meth:`~xarray.Dataset.drop` method returns a new object with the listed
index labels along a dimension dropped:

.. ipython:: python
    :okwarning:

    ds.drop(['IN', 'IL'], dim='space')
    ds.drop(space=['IN', 'IL'])

``drop`` is both a ``Dataset`` and ``DataArray`` method.
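Conceptually, dropping index labels along a dimension keeps every entry whose label is not in the given list. A minimal sketch with a plain dict standing in (hypothetically) for a labeled index:

```python
# Sketch of label-dropping semantics: a dict keyed by label plays the
# role of one labeled dimension; dropping 'IN' and 'IL' keeps the rest.
data = {'IL': 1, 'IN': 2, 'WA': 3}
to_drop = ('IN', 'IL')
dropped = {label: v for label, v in data.items() if label not in to_drop}

assert dropped == {'WA': 3}
```

The real method additionally returns a new object and leaves the original untouched, just as the dict comprehension does here.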

@@ -393,14 +392,6 @@ These methods may also be applied to ``Dataset`` objects
You may find increased performance by loading your data into memory first,
e.g., with :py:meth:`~xarray.Dataset.load`.

-.. note::
-
-    Vectorized indexing is a new feature in v0.10.
-    In older versions of xarray, dimensions of indexers are ignored.
-    Dedicated methods for some advanced indexing use cases,
-    ``isel_points`` and ``sel_points`` are now deprecated.
-    See :ref:`more_advanced_indexing` for their alternative.
-
.. note::

If an indexer is a :py:meth:`~xarray.DataArray`, its coordinates should not
Expand Down