
OBS-URL: https://build.opensuse.org/package/show/devel:languages:python:numeric/python-xarray?expand=0&rev=104
This commit is contained in:
Sebastian Wagner 2024-11-24 13:24:06 +00:00 committed by Git OBS Bridge
parent b40279f857
commit 6297a9ddc6
8 changed files with 222 additions and 282 deletions


@@ -1,20 +1,17 @@
---
xarray/tutorial.py | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
Index: xarray-2024.05.0/xarray/tutorial.py
===================================================================
--- xarray-2024.05.0.orig/xarray/tutorial.py
+++ xarray-2024.05.0/xarray/tutorial.py
@@ -158,7 +158,10 @@ def open_dataset(
url = f"{base_url}/raw/{version}/{path.name}"
--- xarray-2024.11.0/xarray/tutorial.py 2024-11-22 21:58:55.000000000 +0100
+++ xarray-2024.11.0/xarray/tutorial.py.new 2024-11-24 14:18:51.684909924 +0100
@@ -162,9 +162,11 @@
downloader = pooch.HTTPDownloader(headers=headers)
# retrieve the file
- filepath = pooch.retrieve(url=url, known_hash=None, path=cache_dir)
- filepath = pooch.retrieve(
- url=url, known_hash=None, path=cache_dir, downloader=downloader
- )
+ fname = pathlib.Path(cache_dir, path).expanduser()
+ if not fname.exists():
+ fname = None
+ filepath = pooch.retrieve(url=url, fname=fname, known_hash=None, path=cache_dir)
+
ds = _open_dataset(filepath, engine=engine, **kws)
if not cache:
ds = ds.load()
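The patched tutorial loader only asks pooch to download when the file is not already in the local cache. A minimal stdlib sketch of that lookup step (`resolve_cached` is a hypothetical helper name, not from xarray):

```python
import pathlib


def resolve_cached(cache_dir, name):
    # Mirror of the patch's logic: if the file already exists in the
    # local cache, hand pooch its exact name so no download is
    # attempted; otherwise return None and let pooch pick the name.
    fname = pathlib.Path(cache_dir, name).expanduser()
    return fname if fname.exists() else None
```

The returned value would then be passed as ``fname=`` to ``pooch.retrieve``, as in the patch above.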


@@ -1,3 +1,209 @@
-------------------------------------------------------------------
Sun Nov 24 13:23:43 UTC 2024 - Sebastian Wagner <sebix@sebix.at>
- update to version 2024.11.0:
- This release brings better support for wrapping JAX arrays and Astropy Quantity objects, :py:meth:`DataTree.persist`, algorithmic improvements
  to many methods with dask (:py:meth:`Dataset.polyfit`, :py:meth:`Dataset.ffill`, :py:meth:`Dataset.bfill`, rolling reductions), and bug fixes.
- Thanks to the 22 contributors to this release:
- Benoit Bovy, Deepak Cherian, Dimitri Papadopoulos Orfanos, Holly Mandel, James Bourbeau, Joe Hamman, Justus Magin, Kai Mühlbauer, Lukas Trippe, Mathias Hauser, Maximilian Roos, Michael Niklas, Pascal Bourgault, Patrick Hoefler, Sam Levang, Sarah Charlotte Johnson, Scott Huberty, Stephan Hoyer, Tom Nicholas, Virgile Andreani, joseph nowak and tvo.
- New Features:
- Added :py:meth:`DataTree.persist` method (:issue:`9675`, :pull:`9682`).
By `Sam Levang <https://github.com/slevang>`_.
- Added ``write_inherited_coords`` option to :py:meth:`DataTree.to_netcdf`
and :py:meth:`DataTree.to_zarr` (:pull:`9677`).
By `Stephan Hoyer <https://github.com/shoyer>`_.
- Support lazy grouping by dask arrays, and allow specifying ordered groups with ``UniqueGrouper(labels=["a", "b", "c"])``
(:issue:`2852`, :issue:`757`).
By `Deepak Cherian <https://github.com/dcherian>`_.
- Add new ``automatic_rechunk`` kwarg to :py:meth:`DataArrayRolling.construct` and
:py:meth:`DatasetRolling.construct`. This is only useful on ``dask>=2024.11.0``
(:issue:`9550`). By `Deepak Cherian <https://github.com/dcherian>`_.
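The ``construct`` methods above materialize a windowed view of the data; conceptually the operation is a strided sliding window, as in this NumPy-only sketch (not the xarray API):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

# A rolling "construct" is essentially a strided windowed view: no data
# is copied, and each row below is one window of length 3.
a = np.arange(6.0)
windows = sliding_window_view(a, window_shape=3)  # shape (4, 3)
means = windows.mean(axis=1)                      # a rolling mean
```

With dask, that extra window dimension can inflate chunk sizes, which is the problem the new ``automatic_rechunk`` kwarg is described as addressing.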
- Optimize ffill, bfill with dask when limit is specified
(:pull:`9771`).
By `Joseph Nowak <https://github.com/josephnowak>`_, and
`Patrick Hoefler <https://github.com/phofl>`_.
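The semantics being optimized above can be sketched in plain NumPy (``ffill_1d`` is a hypothetical 1-D helper, not xarray's implementation):

```python
import numpy as np


def ffill_1d(a, limit=None):
    # Forward-fill NaNs, propagating each valid value at most
    # `limit` positions (None means unlimited).
    a = a.astype(float).copy()
    last = np.nan
    run = 0
    for i, v in enumerate(a):
        if np.isnan(v):
            run += 1
            if limit is None or run <= limit:
                a[i] = last
        else:
            last = v
            run = 0
    return a
```

``bfill`` is the same operation run in reverse.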
- Allow wrapping ``np.ndarray`` subclasses, e.g. ``astropy.units.Quantity`` (:issue:`9704`, :pull:`9760`).
By `Sam Levang <https://github.com/slevang>`_ and `Tien Vo <https://github.com/tien-vo>`_.
- Optimize :py:meth:`DataArray.polyfit` and :py:meth:`Dataset.polyfit` with dask, when used with
arrays with more than two dimensions.
(:issue:`5629`). By `Deepak Cherian <https://github.com/dcherian>`_.
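The polyfit optimization concerns fitting many series that share one coordinate. The core trick — one shared Vandermonde matrix and a single batched least-squares solve — looks like this in plain NumPy (a sketch, not xarray's implementation):

```python
import numpy as np

# Two series sampled on the same coordinate; solve both fits at once.
x = np.arange(5.0)
data = np.stack([2 * x + 1, -3 * x + 4])  # shape (n_series, n_points)
vand = np.vander(x, N=2)                  # columns: x, 1
coeffs, *_ = np.linalg.lstsq(vand, data.T, rcond=None)
# coeffs[:, i] holds [slope, intercept] of series i
```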
- Support for directly opening remote files as string paths (for example, ``s3://bucket/data.nc``)
with ``fsspec`` when using the ``h5netcdf`` engine (:issue:`9723`, :pull:`9797`).
By `James Bourbeau <https://github.com/jrbourbeau>`_.
- Re-implement the :py:mod:`ufuncs` module, which now dynamically dispatches to the
underlying array's backend. Provides better support for certain wrapped array types
like ``jax.numpy.ndarray``. (:issue:`7848`, :pull:`9776`).
By `Sam Levang <https://github.com/slevang>`_.
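"Dynamically dispatches to the underlying array's backend" means looking the function up in the module that owns the array's type instead of hard-coding NumPy. A toy sketch of that idea (hypothetical helper, not xarray's actual mechanism):

```python
import sys

import numpy as np


def dispatched_sin(x):
    # Resolve the top-level module that defines type(x) (e.g. numpy or
    # jax) and use its `sin` if present, falling back to NumPy.
    root = type(x).__module__.split(".")[0]
    mod = sys.modules.get(root)
    func = getattr(mod, "sin", np.sin) if mod else np.sin
    return func(x)
```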
- Speed up loading of large zarr stores using dask arrays. (:issue:`8902`)
By `Deepak Cherian <https://github.com/dcherian>`_.
- Breaking Changes:
- The minimum versions of some dependencies were changed
===================== ========= =======
Package               Old       New
===================== ========= =======
boto3                 1.28      1.29
dask-core             2023.9    2023.11
distributed           2023.9    2023.11
h5netcdf              1.2       1.3
numbagg               0.2.1     0.6
typing_extensions     4.7       4.8
===================== ========= =======
- Deprecations:
- Grouping by a chunked array (e.g. dask or cubed) currently eagerly loads that variable into
  memory. This behaviour is deprecated. If eager loading was intended, please load such arrays
  manually using ``.load()`` or ``.compute()``. Otherwise, pass ``eagerly_compute_group=False``, and
provide expected group labels using the ``labels`` kwarg to a grouper object such as
:py:class:`grouper.UniqueGrouper` or :py:class:`grouper.BinGrouper`.
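Supplying the expected labels up front, via a grouper object like ``UniqueGrouper(labels=["a", "b", "c"])``, is what makes lazy grouping possible: each value maps straight to a bin without first computing the unique values of the (possibly lazy) group array. A NumPy-only sketch of that idea:

```python
import numpy as np

# With labels known in advance, elements map directly to bin codes;
# the group array never needs a uniquify pass.
labels = np.array(["a", "b", "c"])      # sorted, known up front
group = np.array(["b", "a", "b", "c"])
codes = np.searchsorted(labels, group)  # label -> integer code
counts = np.bincount(codes, minlength=len(labels))
```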
- Bug fixes:
- Fix inadvertent deep-copying of child data in DataTree (:issue:`9683`,
:pull:`9684`).
By `Stephan Hoyer <https://github.com/shoyer>`_.
- Avoid including parent groups when writing DataTree subgroups to Zarr or
netCDF (:pull:`9682`).
By `Stephan Hoyer <https://github.com/shoyer>`_.
- Fix regression in the interoperability of :py:meth:`DataArray.polyfit` and :py:meth:`xr.polyval` for date-time coordinates. (:pull:`9691`).
By `Pascal Bourgault <https://github.com/aulemahal>`_.
- Fix CF decoding of ``grid_mapping`` to allow all possible formats, add tests (:issue:`9761`, :pull:`9765`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Add ``User-Agent`` to request-headers when retrieving tutorial data (:issue:`9774`, :pull:`9782`)
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Documentation:
- Mention attribute peculiarities in docs/docstrings (:issue:`4798`, :pull:`9700`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Internal Changes:
- ``persist`` methods now route through the :py:class:`xr.core.parallelcompat.ChunkManagerEntrypoint` (:pull:`9682`).
By `Sam Levang <https://github.com/slevang>`_.
- update to version 2024.10.0:
- This release brings official support for ``xarray.DataTree``, and compatibility with zarr-python v3!
- Aside from these two huge features, it also improves support for vectorised interpolation and fixes various bugs.
- Thanks to the 31 contributors to this release:
- Alfonso Ladino, DWesl, Deepak Cherian, Eni, Etienne Schalk, Holly Mandel, Ilan Gold, Illviljan, Joe Hamman, Justus Magin, Kai Mühlbauer, Karl Krauth, Mark Harfouche, Martey Dodoo, Matt Savoie, Maximilian Roos, Patrick Hoefler, Peter Hill, Renat Sibgatulin, Ryan Abernathey, Spencer Clark, Stephan Hoyer, Tom Augspurger, Tom Nicholas, Vecko, Virgile Andreani, Yvonne Fröhlich, carschandler, joseph nowak, mgunyho and owenlittlejohns.
- New Features:
- ``DataTree`` related functionality is now exposed in the main ``xarray`` public
API. This includes: ``xarray.DataTree``, ``xarray.open_datatree``, ``xarray.open_groups``,
``xarray.map_over_datasets``, ``xarray.group_subtrees``,
``xarray.register_datatree_accessor`` and ``xarray.testing.assert_isomorphic``.
By `Owen Littlejohns <https://github.com/owenlittlejohns>`_,
`Eni Awowale <https://github.com/eni-awowale>`_,
`Matt Savoie <https://github.com/flamingbear>`_,
`Stephan Hoyer <https://github.com/shoyer>`_,
`Tom Nicholas <https://github.com/TomNicholas>`_,
`Justus Magin <https://github.com/keewis>`_, and
`Alfonso Ladino <https://github.com/aladinor>`_.
- A migration guide for users of the prototype `xarray-contrib/datatree repository <https://github.com/xarray-contrib/datatree>`_ has been added, and can be found in the ``DATATREE_MIGRATION_GUIDE.md`` file in the repository root.
By `Tom Nicholas <https://github.com/TomNicholas>`_.
- Support for Zarr-Python 3 (:issue:`95515`, :pull:`9552`).
By `Tom Augspurger <https://github.com/TomAugspurger>`_,
`Ryan Abernathey <https://github.com/rabernat>`_ and
`Joe Hamman <https://github.com/jhamman>`_.
- Added zarr backends for :py:func:`open_groups` (:issue:`9430`, :pull:`9469`).
By `Eni Awowale <https://github.com/eni-awowale>`_.
- Added support for vectorized interpolation using additional interpolators
from the ``scipy.interpolate`` module (:issue:`9049`, :pull:`9526`).
By `Holly Mandel <https://github.com/hollymandel>`_.
- Implement handling of complex numbers (netcdf4/h5netcdf) and enums (h5netcdf) (:issue:`9246`, :issue:`3297`, :pull:`9509`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Fix passing missing arguments when opening hdf5 and netCDF4 datatrees
(:issue:`9427`, :pull:`9428`).
By `Alfonso Ladino <https://github.com/aladinor>`_.
- Bug fixes:
- Make illegal path-like variable names raise an error when constructing a DataTree from a Dataset
(:issue:`9339`, :pull:`9378`)
By `Etienne Schalk <https://github.com/etienneschalk>`_.
- Work around `upstream pandas issue
<https://github.com/pandas-dev/pandas/issues/56996>`_ to ensure that we can
decode times encoded with small integer dtype values (e.g. ``np.int32``) in
environments with NumPy 2.0 or greater without needing to fall back to cftime
(:pull:`9518`). By `Spencer Clark <https://github.com/spencerkclark>`_.
- Fix bug when encoding times with missing values as floats in the case when
the non-missing times could in theory be encoded with integers
(:issue:`9488`, :pull:`9497`). By `Spencer Clark
<https://github.com/spencerkclark>`_.
- Fix a few bugs affecting groupby reductions with ``flox``. (:issue:`8090`, :issue:`9398`, :issue:`9648`).
By `Deepak Cherian <https://github.com/dcherian>`_.
- Fix the safe_chunks validation option on the to_zarr method
(:issue:`5511`, :pull:`9559`). By `Joseph Nowak
<https://github.com/josephnowak>`_.
- Fix binning by multiple variables where some bins have no observations. (:issue:`9630`).
By `Deepak Cherian <https://github.com/dcherian>`_.
- Fix issue where polyfit wouldn't handle non-dimension coordinates. (:issue:`4375`, :pull:`9369`)
By `Karl Krauth <https://github.com/Karl-Krauth>`_.
- Documentation:
- Migrate documentation for ``datatree`` into main ``xarray`` documentation (:pull:`9033`).
For information on previous ``datatree`` releases, please see:
`datatree's historical release notes <https://xarray-datatree.readthedocs.io/en/latest/>`_.
By `Owen Littlejohns <https://github.com/owenlittlejohns>`_, `Matt Savoie <https://github.com/flamingbear>`_, and
`Tom Nicholas <https://github.com/TomNicholas>`_.
- Internal Changes:
- update to version 2024.09.0:
- This release drops support for Python 3.9, and adds support for grouping by :ref:`multiple arrays <groupby.multiple>`, while providing numerous performance improvements and bug fixes.
- Thanks to the 33 contributors to this release:
- Alfonso Ladino, Andrew Scherer, Anurag Nayak, David Hoese, Deepak Cherian, Diogo Teles Sant'Anna, Dom, Elliott Sales de Andrade, Eni, Holly Mandel, Illviljan, Jack Kelly, Julius Busecke, Justus Magin, Kai Mühlbauer, Manish Kumar Gupta, Matt Savoie, Maximilian Roos, Michele Claus, Miguel Jimenez, Niclas Rieger, Pascal Bourgault, Philip Chmielowiec, Spencer Clark, Stephan Hoyer, Tao Xin, Tiago Sanona, TimothyCera-NOAA, Tom Nicholas, Tom White, Virgile Andreani, oliverhiggs and tiago.
- New Features:
- Add :py:attr:`~core.accessor_dt.DatetimeAccessor.days_in_year` and
:py:attr:`~core.accessor_dt.DatetimeAccessor.decimal_year` to the
``DatetimeAccessor`` on ``xr.DataArray``. (:pull:`9105`).
By `Pascal Bourgault <https://github.com/aulemahal>`_.
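``decimal_year`` expresses a timestamp as the year plus the elapsed fraction of that year. A stdlib sketch of the definition (hypothetical helper, not xarray's accessor):

```python
import datetime as dt


def decimal_year(d):
    # Year plus elapsed fraction of the year; the divisor is the true
    # year length, so leap years come out right automatically.
    start = dt.datetime(d.year, 1, 1)
    end = dt.datetime(d.year + 1, 1, 1)
    return d.year + (d - start) / (end - start)
```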
- Performance:
- Make chunk manager an option in ``set_options`` (:pull:`9362`).
By `Tom White <https://github.com/tomwhite>`_.
- Support for :ref:`grouping by multiple variables <groupby.multiple>`.
This is quite new, so please check your results and report bugs.
Binary operations after grouping by multiple arrays are not supported yet.
(:issue:`1056`, :issue:`9332`, :issue:`324`, :pull:`9372`).
By `Deepak Cherian <https://github.com/dcherian>`_.
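Grouping by multiple variables means every observed combination of labels forms one group. The essence in plain Python (a sketch, not the xarray API):

```python
import numpy as np

# Each unique (a, b) pair becomes one group key.
a = np.array(["x", "x", "y", "y"])
b = np.array([0, 1, 0, 0])
values = np.array([1.0, 2.0, 3.0, 4.0])
groups = {}
for key, v in zip(zip(a.tolist(), b.tolist()), values):
    groups.setdefault(key, []).append(v)
means = {k: sum(vs) / len(vs) for k, vs in groups.items()}
# e.g. means[("y", 0)] averages 3.0 and 4.0
```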
- Allow data variable specific ``constant_values`` in the dataset ``pad`` function (:pull:`9353`).
By `Tiago Sanona <https://github.com/tsanona>`_.
- Speed up grouping by avoiding deep-copy of non-dimension coordinates (:issue:`9426`, :pull:`9393`)
By `Deepak Cherian <https://github.com/dcherian>`_.
- Breaking changes:
- Support for ``python 3.9`` has been dropped (:pull:`8937`)
- The minimum versions of some dependencies were changed
===================== ========= =======
Package               Old       New
===================== ========= =======
boto3                 1.26      1.28
cartopy               0.21      0.22
dask-core             2023.4    2023.9
distributed           2023.4    2023.9
h5netcdf              1.1       1.2
iris                  3.4       3.7
numba                 0.56      0.57
numpy                 1.23      1.24
pandas                2.0       2.1
scipy                 1.10      1.11
typing_extensions     4.5       4.7
zarr                  2.14      2.16
===================== ========= =======
- Bug fixes:
- Fix bug with rechunking to a frequency when some periods contain no data (:issue:`9360`).
By `Deepak Cherian <https://github.com/dcherian>`_.
- Fix bug causing ``DataTree.from_dict`` to be sensitive to insertion order (:issue:`9276`, :pull:`9292`).
By `Tom Nicholas <https://github.com/TomNicholas>`_.
- Fix resampling error with monthly, quarterly, or yearly frequencies with
cftime when the time bins straddle the date "0001-01-01". For example, this
can happen in certain circumstances when the time coordinate contains the
date "0001-01-01". (:issue:`9108`, :pull:`9116`) By `Spencer Clark
<https://github.com/spencerkclark>`_ and `Deepak Cherian
<https://github.com/dcherian>`_.
- Fix issue with passing parameters to ZarrStore.open_store when opening
datatree in zarr format (:issue:`9376`, :pull:`9377`).
By `Alfonso Ladino <https://github.com/aladinor>`_
- Fix deprecation warning that was raised when calling ``np.array`` on an ``xr.DataArray``
in NumPy 2.0 (:issue:`9312`, :pull:`9393`)
By `Andrew Scherer <https://github.com/andrew-s28>`_.
- Fix passing missing arguments when opening hdf5 and netCDF4 datatrees
(:issue:`9427`, :pull:`9428`).
By `Alfonso Ladino <https://github.com/aladinor>`_.
- Fix support for using ``pandas.DateOffset``, ``pandas.Timedelta``, and
``datetime.timedelta`` objects as ``resample`` frequencies
(:issue:`9408`, :pull:`9413`).
By `Oliver Higgs <https://github.com/oliverhiggs>`_.
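A fixed ``timedelta`` frequency amounts to binning timestamps into intervals of that width; a stdlib sketch of the idea (not xarray's resample machinery):

```python
import datetime as dt

freq = dt.timedelta(hours=2)  # the resample frequency
t0 = dt.datetime(2024, 1, 1)
times = [t0 + dt.timedelta(hours=h) for h in range(6)]
values = range(6)
bins = {}
for t, v in zip(times, values):
    key = (t - t0) // freq  # integer bin index
    bins[key] = bins.get(key, 0) + v
# hours 0-1 -> bin 0, hours 2-3 -> bin 1, hours 4-5 -> bin 2
```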
- Internal Changes:
- Re-enable testing ``pydap`` backend with ``numpy>=2`` (:pull:`9391`).
By `Miguel Jimenez <https://github.com/Mikejmnez>`_.
-------------------------------------------------------------------
Sun Nov 24 13:08:38 UTC 2024 - Sebastian Wagner <sebix@sebix.at>


@@ -25,11 +25,11 @@
%define psuffix %{nil}
%endif
-%define ghversion 2024.07.0
+%define ghversion 2024.11.0
%{?sle15_python_module_pythons}
Name: python-xarray%{psuffix}
-Version: 2024.7.0
+Version: 2024.11.0
Release: 0
Summary: N-D labeled arrays and datasets in Python
License: Apache-2.0
@@ -38,12 +38,6 @@ Source: https://github.com/pydata/xarray/archive/refs/tags/v%{ghversion}
# PATCH-FEATURE-UPSTREAM local_dataset.patch gh#pydata/xarray#5377 mcepl@suse.com
# fix xr.tutorial.open_dataset to work with the preloaded cache.
Patch0: local_dataset.patch
-# PATCH-FIX-UPSTREAM xarray-pr9321-dasktests.patch gh#pydata/xarray#9321
-Patch1: xarray-pr9321-dasktests.patch
-# PATCH-FIX-UPSTREAM xarray-pr9356-dasktests.patch gh#pydata/xarray#9356
-Patch2: xarray-pr9356-dasktests.patch
-# PATCH-FIX-UPSTREAM xarray-pr9403-np2.1-scalar.patch gh#pydata/xarray#9403
-Patch3: xarray-pr9403-np2.1-scalar.patch
BuildRequires: %{python_module base >= 3.9}
BuildRequires: %{python_module pip}
BuildRequires: %{python_module setuptools_scm}
@@ -133,6 +127,8 @@ The [io] extra for xarray, N-D labeled arrays and datasets in Python
#%%package parallel
#Summary: The python xarray[parallel] extra
#Requires: python-dask-complete
@@ -140,6 +136,7 @@ The [io] extra for xarray, N-D labeled arrays and datasets in Python
#
#%description parallel
#The [parallel] extra for xarray, N-D labeled arrays and datasets in Python
%package viz
Summary: The python xarray[viz] extra
Requires: python-matplotlib


@@ -1,3 +0,0 @@
-version https://git-lfs.github.com/spec/v1
-oid sha256:3db5160a699a7731fba26b42aef3f175ca3a6adfe5593bebd0b7af90e55d747d
-size 3756547


@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b3b22095e1069ec6dee4694fd2da64147eac75eb1161206b8f970841786e0056
+size 3236300


@@ -1,118 +0,0 @@
From 9406c49fb281d9ffbf88bfd46133288bd23649a4 Mon Sep 17 00:00:00 2001
From: Deepak Cherian <deepak@cherian.net>
Date: Tue, 6 Aug 2024 22:21:29 -0600
Subject: [PATCH 1/2] Fix some dask tests
---
xarray/tests/test_dask.py | 18 +++++++++++-------
1 file changed, 11 insertions(+), 7 deletions(-)
diff --git a/xarray/tests/test_dask.py b/xarray/tests/test_dask.py
index 20491eca91a..1ef759b3d6a 100644
--- a/xarray/tests/test_dask.py
+++ b/xarray/tests/test_dask.py
@@ -640,8 +640,10 @@ def counting_get(*args, **kwargs):
def test_duplicate_dims(self):
data = np.random.normal(size=(4, 4))
- arr = DataArray(data, dims=("x", "x"))
- chunked_array = arr.chunk({"x": 2})
+ with pytest.warns(UserWarning, match="Duplicate dimension"):
+ arr = DataArray(data, dims=("x", "x"))
+ with pytest.warns(UserWarning, match="Duplicate dimension"):
+ chunked_array = arr.chunk({"x": 2})
assert chunked_array.chunks == ((2, 2), (2, 2))
assert chunked_array.chunksizes == {"x": (2, 2)}
@@ -1364,7 +1366,8 @@ def test_map_blocks_ds_transformations(func, map_ds):
@pytest.mark.parametrize("obj", [make_da(), make_ds()])
def test_map_blocks_da_ds_with_template(obj):
func = lambda x: x.isel(x=[1])
- template = obj.isel(x=[1, 5, 9])
+ # a simple .isel(x=[1, 5, 9]) puts all those in a single chunk.
+ template = xr.concat([obj.isel(x=[i]) for i in [1, 5, 9]], dim="x")
with raise_if_dask_computes():
actual = xr.map_blocks(func, obj, template=template)
assert_identical(actual, template)
@@ -1395,15 +1398,16 @@ def test_map_blocks_roundtrip_string_index():
def test_map_blocks_template_convert_object():
da = make_da()
+ ds = da.to_dataset()
+
func = lambda x: x.to_dataset().isel(x=[1])
- template = da.to_dataset().isel(x=[1, 5, 9])
+ template = xr.concat([da.to_dataset().isel(x=[i]) for i in [1, 5, 9]], dim="x")
with raise_if_dask_computes():
actual = xr.map_blocks(func, da, template=template)
assert_identical(actual, template)
- ds = da.to_dataset()
func = lambda x: x.to_dataarray().isel(x=[1])
- template = ds.to_dataarray().isel(x=[1, 5, 9])
+ template = xr.concat([ds.to_dataarray().isel(x=[i]) for i in [1, 5, 9]], dim="x")
with raise_if_dask_computes():
actual = xr.map_blocks(func, ds, template=template)
assert_identical(actual, template)
@@ -1429,7 +1433,7 @@ def test_map_blocks_errors_bad_template(obj):
xr.map_blocks(
lambda a: a.isel(x=[1]).assign_coords(x=[120]), # assign bad index values
obj,
- template=obj.isel(x=[1, 5, 9]),
+ template=xr.concat([obj.isel(x=[i]) for i in [1, 5, 9]], dim="x"),
).compute()
From 6fa200e542fe18b99a86a53126c10639192ea5e1 Mon Sep 17 00:00:00 2001
From: Deepak Cherian <deepak@cherian.net>
Date: Tue, 6 Aug 2024 22:29:24 -0600
Subject: [PATCH 2/2] Cleanup
---
xarray/tests/test_variable.py | 11 +++++------
1 file changed, 5 insertions(+), 6 deletions(-)
diff --git a/xarray/tests/test_variable.py b/xarray/tests/test_variable.py
index 3f3f1756e45..ff6522c00eb 100644
--- a/xarray/tests/test_variable.py
+++ b/xarray/tests/test_variable.py
@@ -318,12 +318,11 @@ def test_datetime64_valid_range(self):
with pytest.raises(pderror, match=r"Out of bounds nanosecond"):
self.cls(["t"], [data])
- @pytest.mark.xfail(reason="pandas issue 36615")
@pytest.mark.filterwarnings("ignore:Converting non-nanosecond")
def test_timedelta64_valid_range(self):
data = np.timedelta64("200000", "D")
pderror = pd.errors.OutOfBoundsTimedelta
- with pytest.raises(pderror, match=r"Out of bounds nanosecond"):
+ with pytest.raises(pderror, match=r"Cannot convert"):
self.cls(["t"], [data])
def test_pandas_data(self):
@@ -2301,20 +2300,20 @@ def test_chunk(self):
assert blocked.chunks == ((3,), (3, 1))
assert blocked.data.name != first_dask_name
- @pytest.mark.xfail
+ @pytest.mark.skip
def test_0d_object_array_with_list(self):
super().test_0d_object_array_with_list()
- @pytest.mark.xfail
+ @pytest.mark.skip
def test_array_interface(self):
# dask array does not have `argsort`
super().test_array_interface()
- @pytest.mark.xfail
+ @pytest.mark.skip
def test_copy_index(self):
super().test_copy_index()
- @pytest.mark.xfail
+ @pytest.mark.skip
@pytest.mark.filterwarnings("ignore:elementwise comparison failed.*:FutureWarning")
def test_eq_all_dtypes(self):
super().test_eq_all_dtypes()


@@ -1,98 +0,0 @@
From 70e3f30d5a636f6d847acb2dd0d12cffeb601d41 Mon Sep 17 00:00:00 2001
From: Deepak Cherian <deepak@cherian.net>
Date: Tue, 13 Aug 2024 19:47:10 -0600
Subject: [PATCH 1/2] xfail np.cross tests
xref #9327
---
xarray/core/computation.py | 6 +++---
xarray/tests/test_computation.py | 12 ++++++++----
2 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/xarray/core/computation.py b/xarray/core/computation.py
index 5d21d0836b9..bb7122e82de 100644
--- a/xarray/core/computation.py
+++ b/xarray/core/computation.py
@@ -23,7 +23,7 @@
from xarray.core.merge import merge_attrs, merge_coordinates_without_align
from xarray.core.options import OPTIONS, _get_keep_attrs
from xarray.core.types import Dims, T_DataArray
-from xarray.core.utils import is_dict_like, is_duck_dask_array, is_scalar, parse_dims
+from xarray.core.utils import is_dict_like, is_scalar, parse_dims
from xarray.core.variable import Variable
from xarray.namedarray.parallelcompat import get_chunked_array_type
from xarray.namedarray.pycompat import is_chunked_array
@@ -1693,11 +1693,11 @@ def cross(
if a.sizes[dim] < b.sizes[dim]:
a = a.pad({dim: (0, 1)}, constant_values=0)
# TODO: Should pad or apply_ufunc handle correct chunking?
- a = a.chunk({dim: -1}) if is_duck_dask_array(a.data) else a
+ a = a.chunk({dim: -1}) if is_chunked_array(a.data) else a
else:
b = b.pad({dim: (0, 1)}, constant_values=0)
# TODO: Should pad or apply_ufunc handle correct chunking?
- b = b.chunk({dim: -1}) if is_duck_dask_array(b.data) else b
+ b = b.chunk({dim: -1}) if is_chunked_array(b.data) else b
else:
raise ValueError(
f"{dim!r} on {'a' if a.sizes[dim] == 1 else 'b'} is incompatible:"
diff --git a/xarray/tests/test_computation.py b/xarray/tests/test_computation.py
index 8b480b02472..e974b8b1ac8 100644
--- a/xarray/tests/test_computation.py
+++ b/xarray/tests/test_computation.py
@@ -2547,7 +2547,8 @@ def test_polyfit_polyval_integration(
"cartesian",
1,
],
- [ # Test 1 sized arrays with coords:
+ # Test 1 sized arrays with coords:
+ pytest.param(
xr.DataArray(
np.array([1]),
dims=["cartesian"],
@@ -2562,8 +2563,10 @@ def test_polyfit_polyval_integration(
np.array([4, 5, 6]),
"cartesian",
-1,
- ],
- [ # Test filling in between with coords:
+ marks=(pytest.mark.xfail(),),
+ ),
+ # Test filling in between with coords:
+ pytest.param(
xr.DataArray(
[1, 2],
dims=["cartesian"],
@@ -2578,7 +2581,8 @@ def test_polyfit_polyval_integration(
np.array([4, 5, 6]),
"cartesian",
-1,
- ],
+ marks=(pytest.mark.xfail(),),
+ ),
],
)
def test_cross(a, b, ae, be, dim: str, axis: int, use_dask: bool) -> None:
From deb9e3266ca163575b200960c14c87fc999dcfc6 Mon Sep 17 00:00:00 2001
From: Deepak Cherian <deepak@cherian.net>
Date: Tue, 13 Aug 2024 19:49:56 -0600
Subject: [PATCH 2/2] Force numpy>=2
---
ci/requirements/environment.yml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/ci/requirements/environment.yml b/ci/requirements/environment.yml
index ef02a3e7f23..40ef4a7fc74 100644
--- a/ci/requirements/environment.yml
+++ b/ci/requirements/environment.yml
@@ -26,7 +26,7 @@ dependencies:
- numba
- numbagg
- numexpr
- - numpy
+ - numpy>=2
- opt_einsum
- packaging
- pandas


@@ -1,44 +0,0 @@
From 17367f3545a48d8b8a18bf8f7054b19351c255dc Mon Sep 17 00:00:00 2001
From: Justus Magin <keewis@posteo.de>
Date: Tue, 27 Aug 2024 15:18:32 +0200
Subject: [PATCH 1/3] also call `np.asarray` on numpy scalars
---
xarray/core/variable.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
Index: xarray-2024.07.0/xarray/core/variable.py
===================================================================
--- xarray-2024.07.0.orig/xarray/core/variable.py
+++ xarray-2024.07.0/xarray/core/variable.py
@@ -309,7 +309,7 @@ def as_compatible_data(
else:
data = np.asarray(data)
- if not isinstance(data, np.ndarray) and (
+ if not isinstance(data, np.ndarray | np.generic) and (
hasattr(data, "__array_function__") or hasattr(data, "__array_namespace__")
):
return cast("T_DuckArray", data)
Index: xarray-2024.07.0/xarray/tests/test_variable.py
===================================================================
--- xarray-2024.07.0.orig/xarray/tests/test_variable.py
+++ xarray-2024.07.0/xarray/tests/test_variable.py
@@ -2585,10 +2585,15 @@ class TestAsCompatibleData(Generic[T_Duc
assert source_ndarray(x) is source_ndarray(as_compatible_data(x))
def test_converted_types(self):
- for input_array in [[[0, 1, 2]], pd.DataFrame([[0, 1, 2]])]:
+ for input_array in [
+ [[0, 1, 2]],
+ pd.DataFrame([[0, 1, 2]]),
+ np.float64(1.4),
+ np.str_("abc"),
+ ]:
actual = as_compatible_data(input_array)
assert_array_equal(np.asarray(input_array), actual)
- assert np.ndarray == type(actual)
+ assert np.ndarray is type(actual)
assert np.asarray(input_array).dtype == actual.dtype
def test_masked_array(self):