Compare commits


No commits in common. "factory" and "devel" have entirely different histories.

11 changed files with 1170 additions and 283 deletions

View File

@@ -1,17 +1,20 @@
(factory side of the compare: local_dataset.patch rebased onto xarray 2024.11.0)
--- xarray-2024.11.0/xarray/tutorial.py	2024-11-22 21:58:55.000000000 +0100
+++ xarray-2024.11.0/xarray/tutorial.py.new	2024-11-24 14:18:51.684909924 +0100
@@ -162,9 +162,11 @@
     downloader = pooch.HTTPDownloader(headers=headers)

     # retrieve the file
-    filepath = pooch.retrieve(
-        url=url, known_hash=None, path=cache_dir, downloader=downloader
-    )
+    fname = pathlib.Path(cache_dir, path).expanduser()
+    if not fname.exists():
+        fname = None
+    filepath = pooch.retrieve(url=url, fname=fname, known_hash=None, path=cache_dir)
+
     ds = _open_dataset(filepath, engine=engine, **kws)
     if not cache:
         ds = ds.load()

(devel side of the compare: the quilt-style patch against the xarray-2024.05.0 tree)
xarray/tutorial.py | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)

Index: xarray-2024.05.0/xarray/tutorial.py
===================================================================
--- xarray-2024.05.0.orig/xarray/tutorial.py
+++ xarray-2024.05.0/xarray/tutorial.py
@@ -158,7 +158,10 @@ def open_dataset(
     url = f"{base_url}/raw/{version}/{path.name}"

     # retrieve the file
-    filepath = pooch.retrieve(url=url, known_hash=None, path=cache_dir)
+    fname = pathlib.Path(cache_dir, path).expanduser()
+    if not fname.exists():
+        fname = None
+    filepath = pooch.retrieve(url=url, fname=fname, known_hash=None, path=cache_dir)
     ds = _open_dataset(filepath, engine=engine, **kws)
     if not cache:
         ds = ds.load()
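As context for the two variants above: both rely on pooch.retrieve() returning an existing file without downloading when no known_hash is given. A minimal runnable sketch of the patched lookup, with an assumed URL and cache directory standing in for the values xarray's tutorial module computes:

    import pathlib

    import pooch

    # Assumed stand-ins for the values xarray's tutorial module computes.
    url = "https://github.com/pydata/xarray-data/raw/master/air_temperature.nc"
    cache_dir = "~/.cache/xarray_tutorial_data"
    path = pathlib.Path("air_temperature.nc")

    # If the build environment pre-seeded the cache under the plain file name,
    # point pooch directly at that file; otherwise fall back to pooch's own
    # URL-derived naming scheme and let it download.
    fname = pathlib.Path(cache_dir, path).expanduser()
    if not fname.exists():
        fname = None
    filepath = pooch.retrieve(url=url, fname=fname, known_hash=None, path=cache_dir)
    print(filepath)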

View File

@@ -1,224 +1,3 @@
-------------------------------------------------------------------
Fri Dec 6 12:27:55 UTC 2024 - Ben Greiner <code@bnavigator.de>
- Skip another type-induced test error on 32-bit
-------------------------------------------------------------------
Sun Nov 24 13:23:43 UTC 2024 - Sebastian Wagner <sebix@sebix.at>
- skip test test_asi8 on 32bit, results in "OverflowError: Python int too large to convert to C long"
- delete obsolete patches xarray-pr9356-dasktests.patch, xarray-pr9321-dasktests.patch and xarray-pr9403-np2.1-scalar.patch
- update to version 2024.11.0:
- This release brings better support for wrapping JAX arrays and Astropy Quantity objects, :py:meth:`DataTree.persist`, algorithmic improvements:
- to many methods with dask (:py:meth:`Dataset.polyfit`, :py:meth:`Dataset.ffill`, :py:meth:`Dataset.bfill`, rolling reductions), and bug fixes.:
- Thanks to the 22 contributors to this release:
- Benoit Bovy, Deepak Cherian, Dimitri Papadopoulos Orfanos, Holly Mandel, James Bourbeau, Joe Hamman, Justus Magin, Kai Mühlbauer, Lukas Trippe, Mathias Hauser, Maximilian Roos, Michael Niklas, Pascal Bourgault, Patrick Hoefler, Sam Levang, Sarah Charlotte Johnson, Scott Huberty, Stephan Hoyer, Tom Nicholas, Virgile Andreani, joseph nowak and tvo:
- New Features:
- Added :py:meth:`DataTree.persist` method (:issue:`9675`, :pull:`9682`).
By `Sam Levang <https://github.com/slevang>`_.
- Added ``write_inherited_coords`` option to :py:meth:`DataTree.to_netcdf`
and :py:meth:`DataTree.to_zarr` (:pull:`9677`).
By `Stephan Hoyer <https://github.com/shoyer>`_.
- Support lazy grouping by dask arrays, and allow specifying ordered groups with ``UniqueGrouper(labels=["a", "b", "c"])``
(:issue:`2852`, :issue:`757`).
By `Deepak Cherian <https://github.com/dcherian>`_.
- Add new ``automatic_rechunk`` kwarg to :py:meth:`DataArrayRolling.construct` and
:py:meth:`DatasetRolling.construct`. This is only useful on ``dask>=2024.11.0``
(:issue:`9550`). By `Deepak Cherian <https://github.com/dcherian>`_.
- Optimize ffill, bfill with dask when limit is specified
(:pull:`9771`).
By `Joseph Nowak <https://github.com/josephnowak>`_, and
`Patrick Hoefler <https://github.com/phofl>`_.
- Allow wrapping ``np.ndarray`` subclasses, e.g. ``astropy.units.Quantity`` (:issue:`9704`, :pull:`9760`).
By `Sam Levang <https://github.com/slevang>`_ and `Tien Vo <https://github.com/tien-vo>`_.
- Optimize :py:meth:`DataArray.polyfit` and :py:meth:`Dataset.polyfit` with dask, when used with
arrays with more than two dimensions.
(:issue:`5629`). By `Deepak Cherian <https://github.com/dcherian>`_.
- Support for directly opening remote files as string paths (for example, ``s3://bucket/data.nc``)
with ``fsspec`` when using the ``h5netcdf`` engine (:issue:`9723`, :pull:`9797`).
By `James Bourbeau <https://github.com/jrbourbeau>`_.
- Re-implement the :py:mod:`ufuncs` module, which now dynamically dispatches to the
underlying array's backend. Provides better support for certain wrapped array types
like ``jax.numpy.ndarray``. (:issue:`7848`, :pull:`9776`).
By `Sam Levang <https://github.com/slevang>`_.
- Speed up loading of large zarr stores using dask arrays. (:issue:`8902`)
By `Deepak Cherian <https://github.com/dcherian>`_.
- Breaking Changes:
- The minimum versions of some dependencies were changed
===================== ========= =======
Package               Old       New
===================== ========= =======
boto3                 1.28      1.29
dask-core             2023.9    2023.11
distributed           2023.9    2023.11
h5netcdf              1.2       1.3
numbagg               0.2.1     0.6
typing_extensions     4.7       4.8
===================== ========= =======
- Deprecations:
- Grouping by a chunked array (e.g. dask or cubed) currently eagerly loads that variable into
memory. This behaviour is deprecated. If eager loading was intended, please load such arrays
manually using ``.load()`` or ``.compute()``. Else pass ``eagerly_compute_group=False``, and
provide expected group labels using the ``labels`` kwarg to a grouper object such as
:py:class:`grouper.UniqueGrouper` or :py:class:`grouper.BinGrouper`.
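A small sketch of the replacement pattern described in the deprecation above, using an assumed toy dataset (requires xarray >= 2024.11.0 and dask; lazy reductions over chunked groups may additionally want flox installed):

    import numpy as np
    import xarray as xr
    from xarray.groupers import UniqueGrouper

    # Toy dataset with a chunked (dask-backed) grouping variable.
    ds = xr.Dataset(
        {"x": ("t", np.arange(6))},
        coords={"labels": ("t", ["a", "b", "c", "a", "b", "c"])},
    ).chunk(t=3)

    # Keep the group variable lazy: declare the expected labels up front
    # instead of letting groupby compute the dask array eagerly.
    grouped = ds.groupby(
        labels=UniqueGrouper(labels=["a", "b", "c"]),
        eagerly_compute_group=False,
    )
    print(grouped.sum())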
- Bug fixes:
- Fix inadvertent deep-copying of child data in DataTree (:issue:`9683`,
:pull:`9684`).
By `Stephan Hoyer <https://github.com/shoyer>`_.
- Avoid including parent groups when writing DataTree subgroups to Zarr or
netCDF (:pull:`9682`).
By `Stephan Hoyer <https://github.com/shoyer>`_.
- Fix regression in the interoperability of :py:meth:`DataArray.polyfit` and :py:meth:`xr.polyval` for date-time coordinates. (:pull:`9691`).
By `Pascal Bourgault <https://github.com/aulemahal>`_.
- Fix CF decoding of ``grid_mapping`` to allow all possible formats, add tests (:issue:`9761`, :pull:`9765`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Add ``User-Agent`` to request-headers when retrieving tutorial data (:issue:`9774`, :pull:`9782`)
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Documentation:
- Mention attribute peculiarities in docs/docstrings (:issue:`4798`, :pull:`9700`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Internal Changes:
- ``persist`` methods now route through the :py:class:`xr.core.parallelcompat.ChunkManagerEntrypoint` (:pull:`9682`).
By `Sam Levang <https://github.com/slevang>`_.
- update to version 2024.10.0:
- This release brings official support for ``xarray.DataTree``, and compatibility with zarr-python v3!:
- Aside from these two huge features, it also improves support for vectorised interpolation and fixes various bugs.:
- Thanks to the 31 contributors to this release:
- Alfonso Ladino, DWesl, Deepak Cherian, Eni, Etienne Schalk, Holly Mandel, Ilan Gold, Illviljan, Joe Hamman, Justus Magin, Kai Mühlbauer, Karl Krauth, Mark Harfouche, Martey Dodoo, Matt Savoie, Maximilian Roos, Patrick Hoefler, Peter Hill, Renat Sibgatulin, Ryan Abernathey, Spencer Clark, Stephan Hoyer, Tom Augspurger, Tom Nicholas, Vecko, Virgile Andreani, Yvonne Fröhlich, carschandler, joseph nowak, mgunyho and owenlittlejohns:
- New Features:
- ``DataTree`` related functionality is now exposed in the main ``xarray`` public
API. This includes: ``xarray.DataTree``, ``xarray.open_datatree``, ``xarray.open_groups``,
``xarray.map_over_datasets``, ``xarray.group_subtrees``,
``xarray.register_datatree_accessor`` and ``xarray.testing.assert_isomorphic``.
By `Owen Littlejohns <https://github.com/owenlittlejohns>`_,
`Eni Awowale <https://github.com/eni-awowale>`_,
`Matt Savoie <https://github.com/flamingbear>`_,
`Stephan Hoyer <https://github.com/shoyer>`_,
`Tom Nicholas <https://github.com/TomNicholas>`_,
`Justus Magin <https://github.com/keewis>`_, and
`Alfonso Ladino <https://github.com/aladinor>`_.
- A migration guide for users of the prototype `xarray-contrib/datatree repository <https://github.com/xarray-contrib/datatree>`_ has been added, and can be found in the ``DATATREE_MIGRATION_GUIDE.md`` file in the repository root.
By `Tom Nicholas <https://github.com/TomNicholas>`_.
- Support for Zarr-Python 3 (:issue:`95515`, :pull:`9552`).
By `Tom Augspurger <https://github.com/TomAugspurger>`_,
`Ryan Abernathey <https://github.com/rabernat>`_ and
`Joe Hamman <https://github.com/jhamman>`_.
- Added zarr backends for :py:func:`open_groups` (:issue:`9430`, :pull:`9469`).
By `Eni Awowale <https://github.com/eni-awowale>`_.
- Added support for vectorized interpolation using additional interpolators
from the ``scipy.interpolate`` module (:issue:`9049`, :pull:`9526`).
By `Holly Mandel <https://github.com/hollymandel>`_.
- Implement handling of complex numbers (netcdf4/h5netcdf) and enums (h5netcdf) (:issue:`9246`, :issue:`3297`, :pull:`9509`).
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
- Fix passing missing arguments when opening hdf5 and netCDF4 datatrees
(:issue:`9427`, :pull:`9428`).
By `Alfonso Ladino <https://github.com/aladinor>`_.
- Bug fixes:
- Make path-like variable names illegal when constructing a DataTree from a Dataset
(:issue:`9339`, :pull:`9378`)
By `Etienne Schalk <https://github.com/etienneschalk>`_.
- Work around `upstream pandas issue
<https://github.com/pandas-dev/pandas/issues/56996>`_ to ensure that we can
decode times encoded with small integer dtype values (e.g. ``np.int32``) in
environments with NumPy 2.0 or greater without needing to fall back to cftime
(:pull:`9518`). By `Spencer Clark <https://github.com/spencerkclark>`_.
- Fix bug when encoding times with missing values as floats in the case when
the non-missing times could in theory be encoded with integers
(:issue:`9488`, :pull:`9497`). By `Spencer Clark
<https://github.com/spencerkclark>`_.
- Fix a few bugs affecting groupby reductions with ``flox``. (:issue:`8090`, :issue:`9398`, :issue:`9648`).
- Fix a few bugs affecting groupby reductions with ``flox``. (:issue:`8090`, :issue:`9398`).
By `Deepak Cherian <https://github.com/dcherian>`_.
- Fix the safe_chunks validation option on the to_zarr method
(:issue:`5511`, :pull:`9559`). By `Joseph Nowak
<https://github.com/josephnowak>`_.
- Fix binning by multiple variables where some bins have no observations. (:issue:`9630`).
By `Deepak Cherian <https://github.com/dcherian>`_.
- Fix issue where polyfit wouldn't handle non-dimension coordinates. (:issue:`4375`, :pull:`9369`)
By `Karl Krauth <https://github.com/Karl-Krauth>`_.
- Documentation:
- Migrate documentation for ``datatree`` into main ``xarray`` documentation (:pull:`9033`).
For information on previous ``datatree`` releases, please see:
`datatree's historical release notes <https://xarray-datatree.readthedocs.io/en/latest/>`_.
By `Owen Littlejohns <https://github.com/owenlittlejohns>`_, `Matt Savoie <https://github.com/flamingbear>`_, and
`Tom Nicholas <https://github.com/TomNicholas>`_.
- Internal Changes:
- update to version 2024.09.0:
- This release drops support for Python 3.9, and adds support for grouping by :ref:`multiple arrays <groupby.multiple>`, while providing numerous performance improvements and bug fixes.:
- Thanks to the 33 contributors to this release:
- Alfonso Ladino, Andrew Scherer, Anurag Nayak, David Hoese, Deepak Cherian, Diogo Teles Sant'Anna, Dom, Elliott Sales de Andrade, Eni, Holly Mandel, Illviljan, Jack Kelly, Julius Busecke, Justus Magin, Kai Mühlbauer, Manish Kumar Gupta, Matt Savoie, Maximilian Roos, Michele Claus, Miguel Jimenez, Niclas Rieger, Pascal Bourgault, Philip Chmielowiec, Spencer Clark, Stephan Hoyer, Tao Xin, Tiago Sanona, TimothyCera-NOAA, Tom Nicholas, Tom White, Virgile Andreani, oliverhiggs and tiago:
- New Features:
- Add :py:attr:`~core.accessor_dt.DatetimeAccessor.days_in_year` and
:py:attr:`~core.accessor_dt.DatetimeAccessor.decimal_year` to the
``DatetimeAccessor`` on ``xr.DataArray``. (:pull:`9105`).
By `Pascal Bourgault <https://github.com/aulemahal>`_.
- Performance:
- Make chunk manager an option in ``set_options`` (:pull:`9362`).
By `Tom White <https://github.com/tomwhite>`_.
- Support for :ref:`grouping by multiple variables <groupby.multiple>`.
This is quite new, so please check your results and report bugs.
Binary operations after grouping by multiple arrays are not supported yet.
(:issue:`1056`, :issue:`9332`, :issue:`324`, :pull:`9372`).
By `Deepak Cherian <https://github.com/dcherian>`_.
- Allow data variable specific ``constant_values`` in the dataset ``pad`` function (:pull:`9353`).
By `Tiago Sanona <https://github.com/tsanona>`_.
- Speed up grouping by avoiding deep-copy of non-dimension coordinates (:issue:`9426`, :pull:`9393`)
By `Deepak Cherian <https://github.com/dcherian>`_.
- Breaking changes:
- Support for ``python 3.9`` has been dropped (:pull:`8937`)
- The minimum versions of some dependencies were changed
===================== ========= =======
Package               Old       New
===================== ========= =======
boto3                 1.26      1.28
cartopy               0.21      0.22
dask-core             2023.4    2023.9
distributed           2023.4    2023.9
h5netcdf              1.1       1.2
iris                  3.4       3.7
numba                 0.56      0.57
numpy                 1.23      1.24
pandas                2.0       2.1
scipy                 1.10      1.11
typing_extensions     4.5       4.7
zarr                  2.14      2.16
===================== ========= =======
- Bug fixes:
- Fix bug with rechunking to a frequency when some periods contain no data (:issue:`9360`).
By `Deepak Cherian <https://github.com/dcherian>`_.
- Fix bug causing ``DataTree.from_dict`` to be sensitive to insertion order (:issue:`9276`, :pull:`9292`).
By `Tom Nicholas <https://github.com/TomNicholas>`_.
- Fix resampling error with monthly, quarterly, or yearly frequencies with
cftime when the time bins straddle the date "0001-01-01". For example, this
can happen in certain circumstances when the time coordinate contains the
date "0001-01-01". (:issue:`9108`, :pull:`9116`) By `Spencer Clark
<https://github.com/spencerkclark>`_ and `Deepak Cherian
<https://github.com/dcherian>`_.
- Fix issue with passing parameters to ZarrStore.open_store when opening
datatree in zarr format (:issue:`9376`, :pull:`9377`).
By `Alfonso Ladino <https://github.com/aladinor>`_
- Fix deprecation warning that was raised when calling ``np.array`` on an ``xr.DataArray``
in NumPy 2.0 (:issue:`9312`, :pull:`9393`)
By `Andrew Scherer <https://github.com/andrew-s28>`_.
- Fix passing missing arguments when opening hdf5 and netCDF4 datatrees
(:issue:`9427`, :pull:`9428`).
By `Alfonso Ladino <https://github.com/aladinor>`_.
- Fix support for using ``pandas.DateOffset``, ``pandas.Timedelta``, and
``datetime.timedelta`` objects as ``resample`` frequencies
(:issue:`9408`, :pull:`9413`).
By `Oliver Higgs <https://github.com/oliverhiggs>`_.
- Internal Changes:
- Re-enable testing ``pydap`` backend with ``numpy>=2`` (:pull:`9391`).
By `Miguel Jimenez <https://github.com/Mikejmnez>`_.
-------------------------------------------------------------------
Sun Nov 24 13:08:38 UTC 2024 - Sebastian Wagner <sebix@sebix.at>
- disable the 'parallel' subpackage because dask is unavailable on Python 3.13, which is because numba is unavailable on 3.13
https://build.opensuse.org/request/show/1225144
https://github.com/numba/numba/issues/9760
- disabled tests requiring dask
-------------------------------------------------------------------
Wed Sep 4 09:11:37 UTC 2024 - Ben Greiner <code@bnavigator.de>
@@ -1398,7 +1177,7 @@ Sun Aug 7 03:12:48 UTC 2022 - Arun Persaud <arun@gmx.de>
details. (PR5692). By Benoît Bovy.
-------------------------------------------------------------------
-Fri Mar 4 18:00:26 UTC 2022 - Sebastian Wagner <sebix@sebix.at>
+Fri Mar 4 18:00:26 UTC 2022 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 2022.03.0:
- This release brings a number of small improvements, as well as a move to `calendar versioning <https://calver.org/>`_ (:issue:`6176`).:
@@ -1438,7 +1217,7 @@ Fri Mar 4 18:00:26 UTC 2022 - Sebastian Wagner <sebix@sebix.at>
By `Sebastian Weigand <https://github.com/s-weigand>`_ and `Joe Hamman <https://github.com/jhamman>`_.
-------------------------------------------------------------------
-Wed Feb 2 19:56:06 UTC 2022 - Sebastian Wagner <sebix@sebix.at>
+Wed Feb 2 19:56:06 UTC 2022 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.21.1:
- This is a bugfix release to resolve (:issue:`6216`, :pull:`6207`).:
@@ -1447,7 +1226,7 @@ Wed Feb 2 19:56:06 UTC 2022 - Sebastian Wagner <sebix@sebix.at>
By `Sebastian Weigand <https://github.com/s-weigand>`_ and `Joe Hamman <https://github.com/jhamman>`_.
-------------------------------------------------------------------
-Sat Jan 29 09:23:51 UTC 2022 - Sebastian Wagner <sebix@sebix.at>
+Sat Jan 29 09:23:51 UTC 2022 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.21.0:
- Many thanks to the 20 contributors to the v0.21.0 release!:
@@ -1504,7 +1283,7 @@ Mon Jan 24 16:28:40 UTC 2022 - Ben Greiner <code@bnavigator.de>
yet
-------------------------------------------------------------------
-Fri Dec 10 08:01:01 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
+Fri Dec 10 08:01:01 UTC 2021 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.20.2:
- This is a bugfix release to resolve (:issue:`3391`, :issue:`5715`). It also:
@@ -1539,7 +1318,7 @@ Fri Dec 10 08:01:01 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
By `Kai Mühlbauer <https://github.com/kmuehlbauer>`_.
-------------------------------------------------------------------
-Mon Nov 15 14:46:40 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
+Mon Nov 15 14:46:40 UTC 2021 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.20.1:
- This is a bugfix release to fix :issue:`5930`.:
@@ -1550,7 +1329,7 @@ Mon Nov 15 14:46:40 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
- Significant improvements to :ref:`api`. By `Deepak Cherian <https://github.com/dcherian>`_.
-------------------------------------------------------------------
-Tue Nov 2 19:44:34 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
+Tue Nov 2 19:44:34 UTC 2021 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.20.0:
- This release brings improved support for pint arrays, methods for weighted standard deviation, variance,:
@@ -1680,7 +1459,7 @@ Tue Sep 21 10:38:39 UTC 2021 - Ben Greiner <code@bnavigator.de>
- Tests require dask[diagnostics] extra now (for Jinja2)
-------------------------------------------------------------------
-Tue Jul 27 11:01:52 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
+Tue Jul 27 11:01:52 UTC 2021 - Sebastian Wagner <sebix+novell.com@sebix.at>
- remove xarray-pr5449-dask-meta.patch, merged upstream.
- remove test_resample_loffset.patch, merged upstream.
@@ -1799,20 +1578,20 @@ Tue May 25 17:25:46 UTC 2021 - Matej Cepl <mcepl@suse.com>
test (gh#pydata/xarray#5364).
-------------------------------------------------------------------
-Thu May 20 11:58:50 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
+Thu May 20 11:58:50 UTC 2021 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.18.2:
- This release reverts a regression in xarray's unstacking of dask-backed arrays.:
- remove fix_test_resample_loffset.patch, doesn't work
-------------------------------------------------------------------
-Wed May 19 13:53:54 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
+Wed May 19 13:53:54 UTC 2021 - Sebastian Wagner <sebix+novell.com@sebix.at>
- add fix_test_resample_loffset.patch to fix test fail on i586
https://github.com/pydata/xarray/issues/5341
-------------------------------------------------------------------
-Wed May 19 07:26:03 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
+Wed May 19 07:26:03 UTC 2021 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.18.1:
- This release is intended as a small patch release to be compatible with the new:
@@ -1858,7 +1637,7 @@ Wed May 19 07:26:03 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
By `Tom Nicholas <https://github.com/TomNicholas>`_.
-------------------------------------------------------------------
-Sun May 9 09:46:42 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
+Sun May 9 09:46:42 UTC 2021 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.18.0:
- This release brings a few important performance improvements, a wide range of:
@@ -2063,7 +1842,7 @@ Sun May 9 09:46:42 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
By `Maximilian Roos <https://github.com/max-sixty>`_.
-------------------------------------------------------------------
-Sat Feb 27 14:38:41 UTC 2021 - Sebastian Wagner <sebix@sebix.at>
+Sat Feb 27 14:38:41 UTC 2021 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.17.0:
- This release brings a few important performance improvements, a wide range of:
@@ -2279,7 +2058,7 @@ Mon Feb 15 12:30:53 UTC 2021 - Ben Greiner <code@bnavigator.de>
- Recommend/Suggest the extras
-------------------------------------------------------------------
-Sun Dec 20 16:09:14 UTC 2020 - Sebastian Wagner <sebix@sebix.at>
+Sun Dec 20 16:09:14 UTC 2020 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.16.2:
- This release brings the ability to write to limited regions of ``zarr`` files, open zarr files with :py:func:`open_dataset` and :py:func:`open_mfdataset`, increased support for propagating ``attrs`` using the ``keep_attrs`` flag, as well as numerous bugfixes and documentation improvements.:
@@ -2392,7 +2171,7 @@ Sun Dec 20 16:09:14 UTC 2020 - Sebastian Wagner <sebix@sebix.at>
<https://github.com/mathause>`_.
-------------------------------------------------------------------
-Wed Sep 23 06:23:20 UTC 2020 - Sebastian Wagner <sebix@sebix.at>
+Wed Sep 23 06:23:20 UTC 2020 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.16.1:
- This patch release fixes an incompatability with a recent pandas change, which:
@@ -2633,7 +2412,7 @@ Sat Mar 28 16:43:47 UTC 2020 - Arun Persaud <arun@gmx.de>
tested instead of skipping the tests.
-------------------------------------------------------------------
-Sat Feb 1 15:02:10 UTC 2020 - Sebastian Wagner <sebix@sebix.at>
+Sat Feb 1 15:02:10 UTC 2020 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.15.0:
- This release brings many improvements to xarray's documentation: our examples are now binderized notebooks (`click here <https://mybinder.org/v2/gh/pydata/xarray/master?urlpath=lab/tree/doc/examples/weather-data.ipynb>`_):
@@ -3081,7 +2860,7 @@ Mon Jul 29 21:32:02 UTC 2019 - Todd R <toddrme2178@gmail.com>
- Disable non-functional dask tests
-------------------------------------------------------------------
-Mon Jul 15 19:31:13 UTC 2019 - Sebastian Wagner <sebix@sebix.at>
+Mon Jul 15 19:31:13 UTC 2019 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.12.3:
- New functions/methods:
@@ -3106,7 +2885,7 @@ Mon Jul 15 19:31:13 UTC 2019 - Sebastian Wagner <sebix@sebix.at>
once (:issue:`2954`).
-------------------------------------------------------------------
-Sun Jun 30 09:29:36 UTC 2019 - Sebastian Wagner <sebix@sebix.at>
+Sun Jun 30 09:29:36 UTC 2019 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.12.2:
- New functions/methods:
@@ -3253,7 +3032,7 @@ Tue Apr 23 09:44:22 UTC 2019 - Tomáš Chvátal <tchvatal@suse.com>
- Just use %pytest macro
-------------------------------------------------------------------
-Sun Apr 7 11:37:34 UTC 2019 - Sebastian Wagner <sebix@sebix.at>
+Sun Apr 7 11:37:34 UTC 2019 - Sebastian Wagner <sebix+novell.com@sebix.at>
- Update to version 0.12.1:
- Enhancements
@@ -3369,7 +3148,7 @@ Wed Feb 13 18:04:03 UTC 2019 - Todd R <toddrme2178@gmail.com>
to ``open_mfdataset``
-------------------------------------------------------------------
-Thu Jan 3 17:40:46 UTC 2019 - Sebastian Wagner <sebix@sebix.at>
+Thu Jan 3 17:40:46 UTC 2019 - Sebastian Wagner <sebix+novell.com@sebix.at>
- update to version 0.11.2:
- Removes inadvertently introduced setup dependency on pytest-runner (:issue:`2641`). Otherwise, this release is exactly equivalent to 0.11.1.
@@ -3574,7 +3353,7 @@ Wed Jul 18 17:32:13 UTC 2018 - arun@gmx.de
Hamman.
-------------------------------------------------------------------
-Sun Jun 10 20:04:48 UTC 2018 - sebix@sebix.at
+Sun Jun 10 20:04:48 UTC 2018 - sebix+novell.com@sebix.at
- update to version 0.10.7:
* Enhancements:
@@ -3694,7 +3473,7 @@ Mon May 21 04:03:17 UTC 2018 - arun@gmx.de
Root.
-------------------------------------------------------------------
-Sat Apr 14 12:41:49 UTC 2018 - sebix@sebix.at
+Sat Apr 14 12:41:49 UTC 2018 - sebix+novell.com@sebix.at
- temporarily deactivated tests because of minor issues with netCDF library
see https://github.com/pydata/xarray/issues/2050
@@ -3794,7 +3573,7 @@ Sun Mar 4 09:34:18 UTC 2018 - jengelh@inai.de
- Replace future goals and aims by present capabilities.
-------------------------------------------------------------------
-Thu Mar 1 11:44:58 UTC 2018 - sebix@sebix.at
+Thu Mar 1 11:44:58 UTC 2018 - sebix+novell.com@sebix.at
- update to version 0.10.1:
* please see upstream changelog at: https://github.com/pydata/xarray/blob/v0.10.1/doc/whats-new.rst

View File

@@ -25,11 +25,11 @@
%define psuffix %{nil}
%endif
-%define ghversion 2024.11.0
+%define ghversion 2024.07.0
%{?sle15_python_module_pythons}
Name: python-xarray%{psuffix}
-Version: 2024.11.0
+Version: 2024.7.0
Release: 0
Summary: N-D labeled arrays and datasets in Python
License: Apache-2.0
@@ -38,6 +38,12 @@ Source: https://github.com/pydata/xarray/archive/refs/tags/v%{ghversion}
# PATCH-FEATURE-UPSTREAM local_dataset.patch gh#pydata/xarray#5377 mcepl@suse.com
# fix xr.tutorial.open_dataset to work with the preloaded cache.
Patch0: local_dataset.patch
+# PATCH-FIX-UPSTREAM xarray-pr9321-dasktests.patch gh#pydata/xarray#9321
+Patch1: xarray-pr9321-dasktests.patch
+# PATCH-FIX-UPSTREAM xarray-pr9356-dasktests.patch gh#pydata/xarray#9356
+Patch2: xarray-pr9356-dasktests.patch
+# PATCH-FIX-UPSTREAM xarray-pr9403-np2.1-scalar.patch gh#pydata/xarray#9403
+Patch3: xarray-pr9403-np2.1-scalar.patch
BuildRequires: %{python_module base >= 3.9}
BuildRequires: %{python_module pip}
BuildRequires: %{python_module setuptools_scm}
@@ -45,9 +51,9 @@ BuildRequires: %{python_module setuptools}
BuildRequires: %{python_module wheel}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
-Requires: python-numpy >= 1.24
+Requires: python-numpy >= 1.23
Requires: python-packaging >= 23.1
-Requires: python-pandas >= 2.1
+Requires: python-pandas >= 2
Obsoletes: python-xray <= 0.7
BuildArch: noarch
%if %{with test}
@@ -70,11 +76,11 @@ The dataset is an in-memory representation of a netCDF file.
Summary: The python xarray[accel] extra
Requires: python-Bottleneck
Requires: python-opt-einsum
-Requires: python-scipy >= 1.11
+Requires: python-scipy
Requires: python-xarray = %{version}
# not available yet
Recommends: python-flox
-Recommends: python-numbagg >= 0.6
+Recommends: python-numbagg
%description accel
The [accel] extra for xarray, N-D labeled arrays and datasets in Python
@@ -87,7 +93,7 @@ Requires: python-xarray = %{version}
Requires: python-xarray-accel = %{version}
Requires: python-xarray-dev = %{version}
Requires: python-xarray-io = %{version}
-#Requires: python-xarray-parallel = %%{version}
+Requires: python-xarray-parallel = %{version}
Requires: python-xarray-viz = %{version}
%description complete
@@ -115,30 +121,24 @@ Except pre-commit, Use `pip-%{python_bin_suffix} --user install pre-commit` to i
Summary: The python xarray[io] extra
Requires: python-cftime
Requires: python-fsspec
-Requires: python-h5netcdf >= 1.3
+Requires: python-h5netcdf
Requires: python-netCDF4
Requires: python-pooch
-Requires: python-scipy >= 1.11
+Requires: python-scipy
Requires: python-xarray = %{version}
-Requires: python-zarr >= 2.16
+Requires: python-zarr
%description io
The [io] extra for xarray, N-D labeled arrays and datasets in Python
-#%%package parallel
-#Summary: The python xarray[parallel] extra
-#Requires: python-dask-complete >= 2023.11
-#Requires: python-xarray = %%{version}
-#
-#%description parallel
-#The [parallel] extra for xarray, N-D labeled arrays and datasets in Python
+%package parallel
+Summary: The python xarray[parallel] extra
+Requires: python-dask-complete
+Requires: python-xarray = %{version}
+%description parallel
+The [parallel] extra for xarray, N-D labeled arrays and datasets in Python
%package viz
Summary: The python xarray[viz] extra
Requires: python-matplotlib
@@ -179,15 +179,12 @@ if [ $(getconf LONG_BIT) -eq 32 ]; then
donttest="$donttest or (test_interpolate_chunk_advanced and linear)"
# tests for 64bit types
donttest="$donttest or TestZarrDictStore or TestZarrDirectoryStore or TestZarrWriteEmpty"
-donttest="$donttest or test_repr_multiindex or test_array_repr_dtypes_unix or test_asi8"
-donttest="$donttest or (test_datatree and TestRepr and test_doc_example)"
+donttest="$donttest or test_repr_multiindex or test_array_repr_dtypes_unix"
fi
# h5py was built without ROS3 support, can't use ros3 driver
donttest="$donttest or TestH5NetCDFDataRos3Driver"
# NetCDF4 fails with these unsupported drivers
donttest="$donttest or (TestNetCDF4 and test_compression_encoding and (szip or zstd or blosc_lz or blosc_zlib))"
-# skip parallelcompat as the 'parallel' subpackage is not built (see changes file)
-donttest="$donttest or test_h5netcdf_storage_options or test_source_encoding_always_present_with_fsspec"
%pytest -n auto -rsEf -k "not ($donttest)" xarray
%endif
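For readers who don't speak RPM: the shell above just assembles a pytest -k deselection expression. A rough Python equivalent, with a hypothetical two-entry skip list (assumes pytest plus pytest-xdist, which provides -n auto):

    import pytest

    # Hypothetical subset of the skip list assembled above.
    donttest = " or ".join(
        [
            "TestH5NetCDFDataRos3Driver",  # h5py built without ROS3 support
            "(TestNetCDF4 and test_compression_encoding and (szip or zstd or blosc_lz or blosc_zlib))",
        ]
    )

    # -k takes a boolean expression over test IDs; "not (...)" deselects matches.
    raise SystemExit(pytest.main(["-n", "auto", "-rsEf", "-k", f"not ({donttest})", "xarray"]))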
@@ -215,9 +212,9 @@ donttest="$donttest or test_h5netcdf_storage_options or test_source_encoding_alw
%doc README.md
%license LICENSE
-#%%files %%{python_files parallel}
-#%doc README.md
-#%%license LICENSE
+%files %{python_files parallel}
+%doc README.md
+%license LICENSE
%files %{python_files viz}
%doc README.md

View File

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:604a801c7bba96524b09b993b159ae998e0987627949b667c884f65895939f11
size 3739324
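This hunk and the two below are Git LFS pointers, not real tarballs: the repository stores only the sha256 ("oid") and size, while the blob itself lives in LFS storage. A sketch of how the oid relates to the file, with a hypothetical local file name:

    import hashlib

    def lfs_oid(path: str) -> str:
        # The LFS oid is simply the sha256 of the stored blob.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical file name; should print the oid recorded in the pointer.
    print(lfs_oid("xarray-source.tar.gz"))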

View File

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3db5160a699a7731fba26b42aef3f175ca3a6adfe5593bebd0b7af90e55d747d
size 3756547

View File

@@ -1,3 +0,0 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b3b22095e1069ec6dee4694fd2da64147eac75eb1161206b8f970841786e0056
size 3236300

xarray-pr8854-np2.patch (new file, 772 lines)
View File

@@ -0,0 +1,772 @@
From e066a6c559e9d7f31c359ea95da42d0e45c585ce Mon Sep 17 00:00:00 2001
From: Justus Magin <keewis@posteo.de>
Date: Tue, 19 Mar 2024 11:32:32 +0100
Subject: [PATCH 01/65] replace the use of `numpy.array_api` with
`array_api_strict`
This would make it a dependency of `namedarray`, and not allow
behavior that is allowed but not required by the array API standard. Otherwise we can:
- use the main `numpy` namespace
- use `array_api_compat` (would also be a new dependency) to allow
optional behavior
---
xarray/namedarray/_array_api.py | 9 ---------
1 file changed, 9 deletions(-)
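To make the commit's motivation concrete, a small sketch of the strict namespace it adopts (assumes array_api_strict is installed; the last line needs numpy >= 2.0):

    import numpy as np
    import array_api_strict as xp

    # array_api_strict exposes only what the array API standard guarantees, so
    # code that runs against it should run against any compliant backend.
    a = xp.asarray([1.5, 2.5])
    b = xp.astype(a, xp.int32)  # astype is a namespace function in the standard
    print(b.dtype == xp.int32)  # True

    # With numpy >= 2.0 the main numpy namespace is itself largely compliant:
    print(np.asarray([1.5, 2.5]).__array_namespace__())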
Index: xarray-2024.05.0/xarray/namedarray/_array_api.py
===================================================================
--- xarray-2024.05.0.orig/xarray/namedarray/_array_api.py
+++ xarray-2024.05.0/xarray/namedarray/_array_api.py
@@ -1,6 +1,5 @@
from __future__ import annotations
-import warnings
from types import ModuleType
from typing import Any
@@ -21,14 +20,6 @@ from xarray.namedarray._typing import (
)
from xarray.namedarray.core import NamedArray
-with warnings.catch_warnings():
- warnings.filterwarnings(
- "ignore",
- r"The numpy.array_api submodule is still experimental",
- category=UserWarning,
- )
- import numpy.array_api as nxp # noqa: F401
-
def _get_data_namespace(x: NamedArray[Any, Any]) -> ModuleType:
if isinstance(x._data, _arrayapi):
@@ -68,13 +59,13 @@ def astype(
Examples
--------
- >>> narr = NamedArray(("x",), nxp.asarray([1.5, 2.5]))
+ >>> narr = NamedArray(("x",), np.asarray([1.5, 2.5]))
>>> narr
<xarray.NamedArray (x: 2)> Size: 16B
- Array([1.5, 2.5], dtype=float64)
+ array([1.5, 2.5])
>>> astype(narr, np.dtype(np.int32))
<xarray.NamedArray (x: 2)> Size: 8B
- Array([1, 2], dtype=int32)
+ array([1, 2], dtype=int32)
"""
if isinstance(x._data, _arrayapi):
xp = x._data.__array_namespace__()
@@ -109,7 +100,7 @@ def imag(
Examples
--------
- >>> narr = NamedArray(("x",), np.asarray([1.0 + 2j, 2 + 4j])) # TODO: Use nxp
+ >>> narr = NamedArray(("x",), np.asarray([1.0 + 2j, 2 + 4j]))
>>> imag(narr)
<xarray.NamedArray (x: 2)> Size: 16B
array([2., 4.])
@@ -141,7 +132,7 @@ def real(
Examples
--------
- >>> narr = NamedArray(("x",), np.asarray([1.0 + 2j, 2 + 4j])) # TODO: Use nxp
+ >>> narr = NamedArray(("x",), np.asarray([1.0 + 2j, 2 + 4j]))
>>> real(narr)
<xarray.NamedArray (x: 2)> Size: 16B
array([1., 2.])
@@ -179,15 +170,15 @@ def expand_dims(
Examples
--------
- >>> x = NamedArray(("x", "y"), nxp.asarray([[1.0, 2.0], [3.0, 4.0]]))
+ >>> x = NamedArray(("x", "y"), np.asarray([[1.0, 2.0], [3.0, 4.0]]))
>>> expand_dims(x)
<xarray.NamedArray (dim_2: 1, x: 2, y: 2)> Size: 32B
- Array([[[1., 2.],
- [3., 4.]]], dtype=float64)
+ array([[[1., 2.],
+ [3., 4.]]])
>>> expand_dims(x, dim="z")
<xarray.NamedArray (z: 1, x: 2, y: 2)> Size: 32B
- Array([[[1., 2.],
- [3., 4.]]], dtype=float64)
+ array([[[1., 2.],
+ [3., 4.]]])
"""
xp = _get_data_namespace(x)
dims = x.dims
Index: xarray-2024.05.0/xarray/tests/__init__.py
===================================================================
--- xarray-2024.05.0.orig/xarray/tests/__init__.py
+++ xarray-2024.05.0/xarray/tests/__init__.py
@@ -147,9 +147,10 @@ has_numbagg_or_bottleneck = has_numbagg
requires_numbagg_or_bottleneck = pytest.mark.skipif(
not has_scipy_or_netCDF4, reason="requires scipy or netCDF4"
)
-has_numpy_array_api, requires_numpy_array_api = _importorskip("numpy", "1.26.0")
has_numpy_2, requires_numpy_2 = _importorskip("numpy", "2.0.0")
+has_array_api_strict, requires_array_api_strict = _importorskip("array_api_strict")
+
def _importorskip_h5netcdf_ros3():
try:
Index: xarray-2024.05.0/xarray/tests/test_array_api.py
===================================================================
--- xarray-2024.05.0.orig/xarray/tests/test_array_api.py
+++ xarray-2024.05.0/xarray/tests/test_array_api.py
@@ -6,20 +6,9 @@ import xarray as xr
from xarray.testing import assert_equal
np = pytest.importorskip("numpy", minversion="1.22")
+xp = pytest.importorskip("array_api_strict")
-try:
- import warnings
-
- with warnings.catch_warnings():
- warnings.simplefilter("ignore")
-
- import numpy.array_api as xp
- from numpy.array_api._array_object import Array
-except ImportError:
- # for `numpy>=2.0`
- xp = pytest.importorskip("array_api_strict")
-
- from array_api_strict._array_object import Array # type: ignore[no-redef]
+from array_api_strict._array_object import Array # isort:skip # type: ignore[no-redef]
@pytest.fixture
@@ -65,8 +54,8 @@ def test_aggregation_skipna(arrays) -> N
def test_astype(arrays) -> None:
np_arr, xp_arr = arrays
expected = np_arr.astype(np.int64)
- actual = xp_arr.astype(np.int64)
- assert actual.dtype == np.int64
+ actual = xp_arr.astype(xp.int64)
+ assert actual.dtype == xp.int64
assert isinstance(actual.data, Array)
assert_equal(actual, expected)
@@ -118,8 +107,10 @@ def test_indexing(arrays: tuple[xr.DataA
def test_properties(arrays: tuple[xr.DataArray, xr.DataArray]) -> None:
np_arr, xp_arr = arrays
- assert np_arr.nbytes == np_arr.data.nbytes
- assert xp_arr.nbytes == np_arr.data.nbytes
+
+ expected = np_arr.data.nbytes
+ assert np_arr.nbytes == expected
+ assert xp_arr.nbytes == expected
def test_reorganizing_operation(arrays: tuple[xr.DataArray, xr.DataArray]) -> None:
Index: xarray-2024.05.0/xarray/tests/test_namedarray.py
===================================================================
--- xarray-2024.05.0.orig/xarray/tests/test_namedarray.py
+++ xarray-2024.05.0/xarray/tests/test_namedarray.py
@@ -1,7 +1,6 @@
from __future__ import annotations
import copy
-import warnings
from abc import abstractmethod
from collections.abc import Mapping
from typing import TYPE_CHECKING, Any, Generic, cast, overload
@@ -79,6 +78,17 @@ class CustomArrayIndexable(
return np
+def check_duck_array_typevar(a: duckarray[Any, _DType]) -> duckarray[Any, _DType]:
+ # Mypy checks a is valid:
+ b: duckarray[Any, _DType] = a
+
+ # Runtime check if valid:
+ if isinstance(b, _arrayfunction_or_api):
+ return b
+ else:
+ raise TypeError(f"a ({type(a)}) is not a valid _arrayfunction or _arrayapi")
+
+
class NamedArraySubclassobjects:
@pytest.fixture
def target(self, data: np.ndarray[Any, Any]) -> Any:
@@ -328,48 +338,27 @@ class TestNamedArray(NamedArraySubclasso
named_array.dims = new_dims
assert named_array.dims == tuple(new_dims)
- def test_duck_array_class(
- self,
- ) -> None:
- def test_duck_array_typevar(
- a: duckarray[Any, _DType],
- ) -> duckarray[Any, _DType]:
- # Mypy checks a is valid:
- b: duckarray[Any, _DType] = a
-
- # Runtime check if valid:
- if isinstance(b, _arrayfunction_or_api):
- return b
- else:
- raise TypeError(
- f"a ({type(a)}) is not a valid _arrayfunction or _arrayapi"
- )
-
+ def test_duck_array_class(self) -> None:
numpy_a: NDArray[np.int64]
numpy_a = np.array([2.1, 4], dtype=np.dtype(np.int64))
- test_duck_array_typevar(numpy_a)
+ check_duck_array_typevar(numpy_a)
masked_a: np.ma.MaskedArray[Any, np.dtype[np.int64]]
masked_a = np.ma.asarray([2.1, 4], dtype=np.dtype(np.int64)) # type: ignore[no-untyped-call]
- test_duck_array_typevar(masked_a)
+ check_duck_array_typevar(masked_a)
custom_a: CustomArrayIndexable[Any, np.dtype[np.int64]]
custom_a = CustomArrayIndexable(numpy_a)
- test_duck_array_typevar(custom_a)
+ check_duck_array_typevar(custom_a)
+ def test_duck_array_class_array_api(self) -> None:
# Test numpy's array api:
- with warnings.catch_warnings():
- warnings.filterwarnings(
- "ignore",
- r"The numpy.array_api submodule is still experimental",
- category=UserWarning,
- )
- import numpy.array_api as nxp
+ nxp = pytest.importorskip("array_api_strict", minversion="1.0")
# TODO: nxp doesn't use dtype typevars, so can only use Any for the moment:
arrayapi_a: duckarray[Any, Any] # duckarray[Any, np.dtype[np.int64]]
- arrayapi_a = nxp.asarray([2.1, 4], dtype=np.dtype(np.int64))
- test_duck_array_typevar(arrayapi_a)
+ arrayapi_a = nxp.asarray([2.1, 4], dtype=nxp.int64)
+ check_duck_array_typevar(arrayapi_a)
def test_new_namedarray(self) -> None:
dtype_float = np.dtype(np.float32)
Index: xarray-2024.05.0/xarray/tests/test_strategies.py
===================================================================
--- xarray-2024.05.0.orig/xarray/tests/test_strategies.py
+++ xarray-2024.05.0/xarray/tests/test_strategies.py
@@ -1,6 +1,9 @@
+import warnings
+
import numpy as np
import numpy.testing as npt
import pytest
+from packaging.version import Version
pytest.importorskip("hypothesis")
# isort: split
@@ -19,7 +22,6 @@ from xarray.testing.strategies import (
unique_subset_of,
variables,
)
-from xarray.tests import requires_numpy_array_api
ALLOWED_ATTRS_VALUES_TYPES = (int, bool, str, np.ndarray)
@@ -199,7 +201,6 @@ class TestVariablesStrategy:
)
)
- @requires_numpy_array_api
@given(st.data())
def test_make_strategies_namespace(self, data):
"""
@@ -208,16 +209,24 @@ class TestVariablesStrategy:
We still want to generate dtypes not in the array API by default, but this checks we don't accidentally override
the user's choice of dtypes with non-API-compliant ones.
"""
- from numpy import (
- array_api as np_array_api, # requires numpy>=1.26.0, and we expect a UserWarning to be raised
- )
+ if Version(np.__version__) >= Version("2.0.0.dev0"):
+ nxp = np
+ else:
+ # requires numpy>=1.26.0, and we expect a UserWarning to be raised
+ with warnings.catch_warnings():
+ warnings.filterwarnings(
+ "ignore", category=UserWarning, message=".+See NEP 47."
+ )
+ from numpy import ( # type: ignore[no-redef,unused-ignore]
+ array_api as nxp,
+ )
- np_array_api_st = make_strategies_namespace(np_array_api)
+ nxp_st = make_strategies_namespace(nxp)
data.draw(
variables(
- array_strategy_fn=np_array_api_st.arrays,
- dtype=np_array_api_st.scalar_dtypes(),
+ array_strategy_fn=nxp_st.arrays,
+ dtype=nxp_st.scalar_dtypes(),
)
)
Index: xarray-2024.05.0/xarray/core/duck_array_ops.py
===================================================================
--- xarray-2024.05.0.orig/xarray/core/duck_array_ops.py
+++ xarray-2024.05.0/xarray/core/duck_array_ops.py
@@ -142,17 +142,25 @@ around.__doc__ = str.replace(
def isnull(data):
data = asarray(data)
- scalar_type = data.dtype.type
- if issubclass(scalar_type, (np.datetime64, np.timedelta64)):
+
+ xp = get_array_namespace(data)
+ scalar_type = data.dtype
+ if dtypes.is_datetime_like(scalar_type):
# datetime types use NaT for null
# note: must check timedelta64 before integers, because currently
# timedelta64 inherits from np.integer
return isnat(data)
- elif issubclass(scalar_type, np.inexact):
+ elif dtypes.isdtype(scalar_type, ("real floating", "complex floating"), xp=xp):
# float types use NaN for null
xp = get_array_namespace(data)
return xp.isnan(data)
- elif issubclass(scalar_type, (np.bool_, np.integer, np.character, np.void)):
+ elif dtypes.isdtype(scalar_type, ("bool", "integral"), xp=xp) or (
+ isinstance(scalar_type, np.dtype)
+ and (
+ np.issubdtype(scalar_type, np.character)
+ or np.issubdtype(scalar_type, np.void)
+ )
+ ):
# these types cannot represent missing values
return full_like(data, dtype=bool, fill_value=False)
else:
@@ -406,13 +414,22 @@ def _create_nan_agg_method(name, coerce_
if invariant_0d and axis == ():
return values
- values = asarray(values)
+ xp = get_array_namespace(values)
+ values = asarray(values, xp=xp)
- if coerce_strings and values.dtype.kind in "SU":
+ if coerce_strings and dtypes.is_string(values.dtype):
values = astype(values, object)
func = None
- if skipna or (skipna is None and values.dtype.kind in "cfO"):
+ if skipna or (
+ skipna is None
+ and (
+ dtypes.isdtype(
+ values.dtype, ("complex floating", "real floating"), xp=xp
+ )
+ or dtypes.is_object(values.dtype)
+ )
+ ):
nanname = "nan" + name
func = getattr(nanops, nanname)
else:
@@ -477,8 +494,8 @@ def _datetime_nanmin(array):
- numpy nanmin() don't work on datetime64 (all versions at the moment of writing)
- dask min() does not work on datetime64 (all versions at the moment of writing)
"""
- assert array.dtype.kind in "mM"
dtype = array.dtype
+ assert dtypes.is_datetime_like(dtype)
# (NaT).astype(float) does not produce NaN...
array = where(pandas_isnull(array), np.nan, array.astype(float))
array = min(array, skipna=True)
@@ -515,7 +532,7 @@ def datetime_to_numeric(array, offset=No
"""
# Set offset to minimum if not given
if offset is None:
- if array.dtype.kind in "Mm":
+ if dtypes.is_datetime_like(array.dtype):
offset = _datetime_nanmin(array)
else:
offset = min(array)
@@ -527,7 +544,7 @@ def datetime_to_numeric(array, offset=No
# This map_blocks call is for backwards compatibility.
# dask == 2021.04.1 does not support subtracting object arrays
# which is required for cftime
- if is_duck_dask_array(array) and np.issubdtype(array.dtype, object):
+ if is_duck_dask_array(array) and dtypes.is_object(array.dtype):
array = array.map_blocks(lambda a, b: a - b, offset, meta=array._meta)
else:
array = array - offset
@@ -537,11 +554,11 @@ def datetime_to_numeric(array, offset=No
array = np.array(array)
# Convert timedelta objects to float by first converting to microseconds.
- if array.dtype.kind in "O":
+ if dtypes.is_object(array.dtype):
return py_timedelta_to_float(array, datetime_unit or "ns").astype(dtype)
# Convert np.NaT to np.nan
- elif array.dtype.kind in "mM":
+ elif dtypes.is_datetime_like(array.dtype):
# Convert to specified timedelta units.
if datetime_unit:
array = array / np.timedelta64(1, datetime_unit)
@@ -641,7 +658,7 @@ def mean(array, axis=None, skipna=None,
from xarray.core.common import _contains_cftime_datetimes
array = asarray(array)
- if array.dtype.kind in "Mm":
+ if dtypes.is_datetime_like(array.dtype):
offset = _datetime_nanmin(array)
# xarray always uses np.datetime64[ns] for np.datetime64 data
@@ -689,7 +706,9 @@ def cumsum(array, axis=None, **kwargs):
def first(values, axis, skipna=None):
"""Return the first non-NA elements in this array along the given axis"""
- if (skipna or skipna is None) and values.dtype.kind not in "iSU":
+ if (skipna or skipna is None) and not (
+ dtypes.isdtype(values.dtype, "signed integer") or dtypes.is_string(values.dtype)
+ ):
# only bother for dtypes that can hold NaN
if is_chunked_array(values):
return chunked_nanfirst(values, axis)
@@ -700,7 +719,9 @@ def first(values, axis, skipna=None):
def last(values, axis, skipna=None):
"""Return the last non-NA elements in this array along the given axis"""
- if (skipna or skipna is None) and values.dtype.kind not in "iSU":
+ if (skipna or skipna is None) and not (
+ dtypes.isdtype(values.dtype, "signed integer") or dtypes.is_string(values.dtype)
+ ):
# only bother for dtypes that can hold NaN
if is_chunked_array(values):
return chunked_nanlast(values, axis)
Index: xarray-2024.05.0/xarray/core/dtypes.py
===================================================================
--- xarray-2024.05.0.orig/xarray/core/dtypes.py
+++ xarray-2024.05.0/xarray/core/dtypes.py
@@ -4,8 +4,9 @@ import functools
from typing import Any
import numpy as np
+from pandas.api.types import is_extension_array_dtype
-from xarray.core import utils
+from xarray.core import npcompat, utils
# Use as a sentinel value to indicate a dtype appropriate NA value.
NA = utils.ReprObject("<NA>")
@@ -60,22 +61,22 @@ def maybe_promote(dtype: np.dtype) -> tu
# N.B. these casting rules should match pandas
dtype_: np.typing.DTypeLike
fill_value: Any
- if np.issubdtype(dtype, np.floating):
+ if isdtype(dtype, "real floating"):
dtype_ = dtype
fill_value = np.nan
- elif np.issubdtype(dtype, np.timedelta64):
+ elif isinstance(dtype, np.dtype) and np.issubdtype(dtype, np.timedelta64):
# See https://github.com/numpy/numpy/issues/10685
# np.timedelta64 is a subclass of np.integer
# Check np.timedelta64 before np.integer
fill_value = np.timedelta64("NaT")
dtype_ = dtype
- elif np.issubdtype(dtype, np.integer):
+ elif isdtype(dtype, "integral"):
dtype_ = np.float32 if dtype.itemsize <= 2 else np.float64
fill_value = np.nan
- elif np.issubdtype(dtype, np.complexfloating):
+ elif isdtype(dtype, "complex floating"):
dtype_ = dtype
fill_value = np.nan + np.nan * 1j
- elif np.issubdtype(dtype, np.datetime64):
+ elif isinstance(dtype, np.dtype) and np.issubdtype(dtype, np.datetime64):
dtype_ = dtype
fill_value = np.datetime64("NaT")
else:
@@ -118,16 +119,16 @@ def get_pos_infinity(dtype, max_for_int=
-------
fill_value : positive infinity value corresponding to this dtype.
"""
- if issubclass(dtype.type, np.floating):
+ if isdtype(dtype, "real floating"):
return np.inf
- if issubclass(dtype.type, np.integer):
+ if isdtype(dtype, "integral"):
if max_for_int:
return np.iinfo(dtype).max
else:
return np.inf
- if issubclass(dtype.type, np.complexfloating):
+ if isdtype(dtype, "complex floating"):
return np.inf + 1j * np.inf
return INF
@@ -146,24 +147,66 @@ def get_neg_infinity(dtype, min_for_int=
-------
fill_value : positive infinity value corresponding to this dtype.
"""
- if issubclass(dtype.type, np.floating):
+ if isdtype(dtype, "real floating"):
return -np.inf
- if issubclass(dtype.type, np.integer):
+ if isdtype(dtype, "integral"):
if min_for_int:
return np.iinfo(dtype).min
else:
return -np.inf
- if issubclass(dtype.type, np.complexfloating):
+ if isdtype(dtype, "complex floating"):
return -np.inf - 1j * np.inf
return NINF
-def is_datetime_like(dtype):
+def is_datetime_like(dtype) -> bool:
"""Check if a dtype is a subclass of the numpy datetime types"""
- return np.issubdtype(dtype, np.datetime64) or np.issubdtype(dtype, np.timedelta64)
+ return _is_numpy_subdtype(dtype, (np.datetime64, np.timedelta64))
+
+
+def is_object(dtype) -> bool:
+ """Check if a dtype is object"""
+ return _is_numpy_subdtype(dtype, object)
+
+
+def is_string(dtype) -> bool:
+ """Check if a dtype is a string dtype"""
+ return _is_numpy_subdtype(dtype, (np.str_, np.character))
+
+
+def _is_numpy_subdtype(dtype, kind) -> bool:
+ if not isinstance(dtype, np.dtype):
+ return False
+
+ kinds = kind if isinstance(kind, tuple) else (kind,)
+ return any(np.issubdtype(dtype, kind) for kind in kinds)
+
+
+def isdtype(dtype, kind: str | tuple[str, ...], xp=None) -> bool:
+ """Compatibility wrapper for isdtype() from the array API standard.
+
+ Unlike xp.isdtype(), kind must be a string.
+ """
+ # TODO(shoyer): remove this wrapper when Xarray requires
+ # numpy>=2 and pandas extensions arrays are implemented in
+ # Xarray via the array API
+ if not isinstance(kind, str) and not (
+ isinstance(kind, tuple) and all(isinstance(k, str) for k in kind)
+ ):
+ raise TypeError(f"kind must be a string or a tuple of strings: {repr(kind)}")
+
+ if isinstance(dtype, np.dtype):
+ return npcompat.isdtype(dtype, kind)
+ elif is_extension_array_dtype(dtype):
+ # we never want to match pandas extension array dtypes
+ return False
+ else:
+ if xp is None:
+ xp = np
+ return xp.isdtype(dtype, kind)
def result_type(
@@ -184,12 +227,26 @@ def result_type(
-------
numpy.dtype for the result.
"""
- types = {np.result_type(t).type for t in arrays_and_dtypes}
+ from xarray.core.duck_array_ops import get_array_namespace
+
+ # TODO(shoyer): consider moving this logic into get_array_namespace()
+ # or another helper function.
+ namespaces = {get_array_namespace(t) for t in arrays_and_dtypes}
+ non_numpy = namespaces - {np}
+ if non_numpy:
+ [xp] = non_numpy
+ else:
+ xp = np
+
+ types = {xp.result_type(t) for t in arrays_and_dtypes}
- for left, right in PROMOTE_TO_OBJECT:
- if any(issubclass(t, left) for t in types) and any(
- issubclass(t, right) for t in types
- ):
- return np.dtype(object)
+ if any(isinstance(t, np.dtype) for t in types):
+        # only check if there's numpy dtypes - the array API does not
+ # define the types we're checking for
+ for left, right in PROMOTE_TO_OBJECT:
+ if any(np.issubdtype(t, left) for t in types) and any(
+ np.issubdtype(t, right) for t in types
+ ):
+ return xp.dtype(object)
- return np.result_type(*arrays_and_dtypes)
+ return xp.result_type(*arrays_and_dtypes)
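A minimal usage sketch of the `isdtype()` wrapper added above, assuming the patch is applied and (for the last call) `array_api_strict` is installed; it mirrors the cases exercised in the test hunk further down:
```python
# Illustrative only; not part of the patch.
import numpy as np
from xarray.core import dtypes

print(dtypes.isdtype(np.dtype("int32"), "integral"))         # True
print(dtypes.isdtype(np.dtype("float16"), "real floating"))  # True
print(dtypes.isdtype(np.dtype("U"), "numeric"))              # False: strings are not numeric

# non-numpy dtypes are dispatched to the given array namespace
import array_api_strict as xp
print(dtypes.isdtype(xp.int32, "integral", xp=xp))           # True
```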
Index: xarray-2024.05.0/xarray/namedarray/core.py
===================================================================
--- xarray-2024.05.0.orig/xarray/namedarray/core.py
+++ xarray-2024.05.0/xarray/namedarray/core.py
@@ -470,10 +470,28 @@ class NamedArray(NamedArrayAggregations,
If the underlying data array does not include ``nbytes``, estimates
the bytes consumed based on the ``size`` and ``dtype``.
"""
+ from xarray.namedarray._array_api import _get_data_namespace
+
if hasattr(self._data, "nbytes"):
return self._data.nbytes # type: ignore[no-any-return]
+
+ if hasattr(self.dtype, "itemsize"):
+ itemsize = self.dtype.itemsize
+ elif isinstance(self._data, _arrayapi):
+ xp = _get_data_namespace(self)
+
+ if xp.isdtype(self.dtype, "bool"):
+ itemsize = 1
+ elif xp.isdtype(self.dtype, "integral"):
+ itemsize = xp.iinfo(self.dtype).bits // 8
+ else:
+ itemsize = xp.finfo(self.dtype).bits // 8
else:
- return self.size * self.dtype.itemsize
+ raise TypeError(
+ "cannot compute the number of bytes (no array API nor nbytes / itemsize)"
+ )
+
+ return self.size * itemsize
@property
def dims(self) -> _Dims:
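The fallback added above derives the element size from the array API's `iinfo`/`finfo` bit widths when the dtype has no `itemsize`. A standalone sketch, assuming `array_api_strict` as the namespace (the real code obtains `xp` from the wrapped data via `_get_data_namespace`):
```python
# Standalone sketch of the itemsize fallback; illustrative only.
import array_api_strict as xp

def itemsize_from_api(dtype) -> int:
    if xp.isdtype(dtype, "bool"):
        return 1  # the standard exposes no bit width for bool
    if xp.isdtype(dtype, "integral"):
        return xp.iinfo(dtype).bits // 8
    return xp.finfo(dtype).bits // 8

size = 12  # e.g. an array of shape (3, 4)
print(size * itemsize_from_api(xp.int32))    # 48
print(size * itemsize_from_api(xp.float64))  # 96
```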
Index: xarray-2024.05.0/xarray/tests/test_dtypes.py
===================================================================
--- xarray-2024.05.0.orig/xarray/tests/test_dtypes.py
+++ xarray-2024.05.0/xarray/tests/test_dtypes.py
@@ -4,6 +4,18 @@ import numpy as np
import pytest
from xarray.core import dtypes
+from xarray.tests import requires_array_api_strict
+
+try:
+ import array_api_strict
+except ImportError:
+
+ class DummyArrayAPINamespace:
+ bool = None
+ int32 = None
+ float64 = None
+
+ array_api_strict = DummyArrayAPINamespace
@pytest.mark.parametrize(
@@ -58,7 +70,6 @@ def test_inf(obj) -> None:
@pytest.mark.parametrize(
"kind, expected",
[
- ("a", (np.dtype("O"), "nan")), # dtype('S')
("b", (np.float32, "nan")), # dtype('int8')
("B", (np.float32, "nan")), # dtype('uint8')
("c", (np.dtype("O"), "nan")), # dtype('S1')
@@ -98,3 +109,54 @@ def test_nat_types_membership() -> None:
assert np.datetime64("NaT").dtype in dtypes.NAT_TYPES
assert np.timedelta64("NaT").dtype in dtypes.NAT_TYPES
assert np.float64 not in dtypes.NAT_TYPES
+
+
+@pytest.mark.parametrize(
+ ["dtype", "kinds", "xp", "expected"],
+ (
+ (np.dtype("int32"), "integral", np, True),
+ (np.dtype("float16"), "real floating", np, True),
+ (np.dtype("complex128"), "complex floating", np, True),
+ (np.dtype("U"), "numeric", np, False),
+ pytest.param(
+ array_api_strict.int32,
+ "integral",
+ array_api_strict,
+ True,
+ marks=requires_array_api_strict,
+ id="array_api-int",
+ ),
+ pytest.param(
+ array_api_strict.float64,
+ "real floating",
+ array_api_strict,
+ True,
+ marks=requires_array_api_strict,
+ id="array_api-float",
+ ),
+ pytest.param(
+ array_api_strict.bool,
+ "numeric",
+ array_api_strict,
+ False,
+ marks=requires_array_api_strict,
+ id="array_api-bool",
+ ),
+ ),
+)
+def test_isdtype(dtype, kinds, xp, expected) -> None:
+ actual = dtypes.isdtype(dtype, kinds, xp=xp)
+ assert actual == expected
+
+
+@pytest.mark.parametrize(
+ ["dtype", "kinds", "xp", "error", "pattern"],
+ (
+ (np.dtype("int32"), "foo", np, (TypeError, ValueError), "kind"),
+ (np.dtype("int32"), np.signedinteger, np, TypeError, "kind"),
+ (np.dtype("float16"), 1, np, TypeError, "kind"),
+ ),
+)
+def test_isdtype_error(dtype, kinds, xp, error, pattern):
+ with pytest.raises(error, match=pattern):
+ dtypes.isdtype(dtype, kinds, xp=xp)
Index: xarray-2024.05.0/xarray/core/npcompat.py
===================================================================
--- xarray-2024.05.0.orig/xarray/core/npcompat.py
+++ xarray-2024.05.0/xarray/core/npcompat.py
@@ -28,3 +28,33 @@
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+try:
+ # requires numpy>=2.0
+ from numpy import isdtype # type: ignore[attr-defined,unused-ignore]
+except ImportError:
+ import numpy as np
+
+ dtype_kinds = {
+ "bool": np.bool_,
+ "signed integer": np.signedinteger,
+ "unsigned integer": np.unsignedinteger,
+ "integral": np.integer,
+ "real floating": np.floating,
+ "complex floating": np.complexfloating,
+ "numeric": np.number,
+ }
+
+ def isdtype(dtype, kind):
+ kinds = kind if isinstance(kind, tuple) else (kind,)
+
+ unknown_dtypes = [kind for kind in kinds if kind not in dtype_kinds]
+ if unknown_dtypes:
+ raise ValueError(f"unknown dtype kinds: {unknown_dtypes}")
+
+ # verified the dtypes already, no need to check again
+ translated_kinds = [dtype_kinds[kind] for kind in kinds]
+ if isinstance(dtype, np.generic):
+ return any(isinstance(dtype, kind) for kind in translated_kinds)
+ else:
+ return any(np.issubdtype(dtype, kind) for kind in translated_kinds)
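For numpy < 2 the fallback above reproduces the behavior of numpy 2's `np.isdtype()`: string kinds map to the abstract scalar types, and unknown kinds raise `ValueError`. A quick check against the patched module:
```python
# Exercising the numpy<2 fallback; illustrative only.
import numpy as np
from xarray.core.npcompat import isdtype

print(isdtype(np.dtype("float32"), "real floating"))     # True
print(isdtype(np.dtype("uint8"), ("integral", "bool")))  # True
try:
    isdtype(np.dtype("int8"), "foo")
except ValueError as exc:
    print(exc)  # unknown dtype kinds: ['foo']
```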
@@ -0,0 +1,73 @@
From cc4daebf1a4a41483c6b60fc57d82d8bc30911e5 Mon Sep 17 00:00:00 2001
From: Mark Harfouche <mark.harfouche@gmail.com>
Date: Sat, 18 May 2024 12:54:03 -0400
Subject: [PATCH] Use ME in test_plot instead of M
```
pytest xarray/tests/test_plot.py::TestNcAxisNotInstalled::test_ncaxis_notinstalled_line_plot
```
would return the following error
```
xarray/tests/test_plot.py E [100%]
======================================= ERRORS =======================================
____ ERROR at setup of TestNcAxisNotInstalled.test_ncaxis_notinstalled_line_plot _____
self = <xarray.tests.test_plot.TestNcAxisNotInstalled object at 0x78ed1992aa10>
@pytest.fixture(autouse=True)
def setUp(self) -> None:
"""
Create a DataArray with a time-axis that contains cftime.datetime
objects.
"""
month = np.arange(1, 13, 1)
data = np.sin(2 * np.pi * month / 12.0)
darray = DataArray(data, dims=["time"])
> darray.coords["time"] = xr.cftime_range(
start="2017", periods=12, freq="1M", calendar="noleap"
)
/home/mark/git/xarray/xarray/tests/test_plot.py:3004:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/home/mark/git/xarray/xarray/coding/cftime_offsets.py:1129: in cftime_range
offset = to_offset(freq)
/home/mark/git/xarray/xarray/coding/cftime_offsets.py:767: in to_offset
_emit_freq_deprecation_warning(freq)
/home/mark/git/xarray/xarray/coding/cftime_offsets.py:751: in _emit_freq_deprecation_warning
emit_user_level_warning(message, FutureWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
message = "'M' is deprecated and will be removed in a future version. Please use 'ME' instead of 'M'."
category = <class 'FutureWarning'>
def emit_user_level_warning(message, category=None) -> None:
"""Emit a warning at the user level by inspecting the stack trace."""
stacklevel = find_stack_level()
> return warnings.warn(message, category=category, stacklevel=stacklevel)
E FutureWarning: 'M' is deprecated and will be removed in a future version. Please use 'ME' instead of 'M'.
/home/mark/git/xarray/xarray/core/utils.py:1112: FutureWarning
============================== short test summary info ===============================
ERROR xarray/tests/test_plot.py::TestNcAxisNotInstalled::test_ncaxis_notinstalled_line_plot - FutureWarning: 'M' is deprecated and will be removed in a future version. Please ...
================================== 1 error in 0.64s ==================================
```
---
xarray/tests/test_plot.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/xarray/tests/test_plot.py b/xarray/tests/test_plot.py
index e636be5589f..27f4ded5646 100644
--- a/xarray/tests/test_plot.py
+++ b/xarray/tests/test_plot.py
@@ -3002,7 +3002,7 @@ def setUp(self) -> None:
data = np.sin(2 * np.pi * month / 12.0)
darray = DataArray(data, dims=["time"])
darray.coords["time"] = xr.cftime_range(
- start="2017", periods=12, freq="1M", calendar="noleap"
+ start="2017", periods=12, freq="1ME", calendar="noleap"
)
self.darray = darray
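The rename tracks pandas 2.2, which deprecated the month-end frequency alias `'M'` in favor of `'ME'`. A quick check of the fixed call (requires cftime):
```python
# "1ME" (month end) replaces the deprecated "1M" alias.
import xarray as xr

times = xr.cftime_range(start="2017", periods=12, freq="1ME", calendar="noleap")
print(times[0], times[-1])  # 2017-01-31 00:00:00 2017-12-31 00:00:00
```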
@@ -0,0 +1,118 @@
From 9406c49fb281d9ffbf88bfd46133288bd23649a4 Mon Sep 17 00:00:00 2001
From: Deepak Cherian <deepak@cherian.net>
Date: Tue, 6 Aug 2024 22:21:29 -0600
Subject: [PATCH 1/2] Fix some dask tests
---
xarray/tests/test_dask.py | 18 +++++++++++-------
1 file changed, 11 insertions(+), 7 deletions(-)
diff --git a/xarray/tests/test_dask.py b/xarray/tests/test_dask.py
index 20491eca91a..1ef759b3d6a 100644
--- a/xarray/tests/test_dask.py
+++ b/xarray/tests/test_dask.py
@@ -640,8 +640,10 @@ def counting_get(*args, **kwargs):
def test_duplicate_dims(self):
data = np.random.normal(size=(4, 4))
- arr = DataArray(data, dims=("x", "x"))
- chunked_array = arr.chunk({"x": 2})
+ with pytest.warns(UserWarning, match="Duplicate dimension"):
+ arr = DataArray(data, dims=("x", "x"))
+ with pytest.warns(UserWarning, match="Duplicate dimension"):
+ chunked_array = arr.chunk({"x": 2})
assert chunked_array.chunks == ((2, 2), (2, 2))
assert chunked_array.chunksizes == {"x": (2, 2)}
@@ -1364,7 +1366,8 @@ def test_map_blocks_ds_transformations(func, map_ds):
@pytest.mark.parametrize("obj", [make_da(), make_ds()])
def test_map_blocks_da_ds_with_template(obj):
func = lambda x: x.isel(x=[1])
- template = obj.isel(x=[1, 5, 9])
+ # a simple .isel(x=[1, 5, 9]) puts all those in a single chunk.
+ template = xr.concat([obj.isel(x=[i]) for i in [1, 5, 9]], dim="x")
with raise_if_dask_computes():
actual = xr.map_blocks(func, obj, template=template)
assert_identical(actual, template)
@@ -1395,15 +1398,16 @@ def test_map_blocks_roundtrip_string_index():
def test_map_blocks_template_convert_object():
da = make_da()
+ ds = da.to_dataset()
+
func = lambda x: x.to_dataset().isel(x=[1])
- template = da.to_dataset().isel(x=[1, 5, 9])
+ template = xr.concat([da.to_dataset().isel(x=[i]) for i in [1, 5, 9]], dim="x")
with raise_if_dask_computes():
actual = xr.map_blocks(func, da, template=template)
assert_identical(actual, template)
- ds = da.to_dataset()
func = lambda x: x.to_dataarray().isel(x=[1])
- template = ds.to_dataarray().isel(x=[1, 5, 9])
+ template = xr.concat([ds.to_dataarray().isel(x=[i]) for i in [1, 5, 9]], dim="x")
with raise_if_dask_computes():
actual = xr.map_blocks(func, ds, template=template)
assert_identical(actual, template)
@@ -1429,7 +1433,7 @@ def test_map_blocks_errors_bad_template(obj):
xr.map_blocks(
lambda a: a.isel(x=[1]).assign_coords(x=[120]), # assign bad index values
obj,
- template=obj.isel(x=[1, 5, 9]),
+ template=xr.concat([obj.isel(x=[i]) for i in [1, 5, 9]], dim="x"),
).compute()
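The template changes above all address the same issue: newer dask may merge the result of a fancy `isel` into a single chunk, while `map_blocks` emits one chunk per block, so the template's chunks no longer line up with the computed result. Concatenating single-index selections preserves the chunk-per-block layout. A sketch of the difference (exact chunking depends on the dask version; dask required):
```python
# Fancy isel vs. per-index concat: compare chunk layouts.
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(10.0), dims="x").chunk({"x": 4})

print(da.isel(x=[1, 5, 9]).chunks)  # may collapse to ((3,),) on newer dask
print(xr.concat([da.isel(x=[i]) for i in [1, 5, 9]], dim="x").chunks)  # ((1, 1, 1),)
```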
From 6fa200e542fe18b99a86a53126c10639192ea5e1 Mon Sep 17 00:00:00 2001
From: Deepak Cherian <deepak@cherian.net>
Date: Tue, 6 Aug 2024 22:29:24 -0600
Subject: [PATCH 2/2] Cleanup
---
xarray/tests/test_variable.py | 11 +++++------
1 file changed, 5 insertions(+), 6 deletions(-)
diff --git a/xarray/tests/test_variable.py b/xarray/tests/test_variable.py
index 3f3f1756e45..ff6522c00eb 100644
--- a/xarray/tests/test_variable.py
+++ b/xarray/tests/test_variable.py
@@ -318,12 +318,11 @@ def test_datetime64_valid_range(self):
with pytest.raises(pderror, match=r"Out of bounds nanosecond"):
self.cls(["t"], [data])
- @pytest.mark.xfail(reason="pandas issue 36615")
@pytest.mark.filterwarnings("ignore:Converting non-nanosecond")
def test_timedelta64_valid_range(self):
data = np.timedelta64("200000", "D")
pderror = pd.errors.OutOfBoundsTimedelta
- with pytest.raises(pderror, match=r"Out of bounds nanosecond"):
+ with pytest.raises(pderror, match=r"Cannot convert"):
self.cls(["t"], [data])
def test_pandas_data(self):
@@ -2301,20 +2300,20 @@ def test_chunk(self):
assert blocked.chunks == ((3,), (3, 1))
assert blocked.data.name != first_dask_name
- @pytest.mark.xfail
+ @pytest.mark.skip
def test_0d_object_array_with_list(self):
super().test_0d_object_array_with_list()
- @pytest.mark.xfail
+ @pytest.mark.skip
def test_array_interface(self):
# dask array does not have `argsort`
super().test_array_interface()
- @pytest.mark.xfail
+ @pytest.mark.skip
def test_copy_index(self):
super().test_copy_index()
- @pytest.mark.xfail
+ @pytest.mark.skip
@pytest.mark.filterwarnings("ignore:elementwise comparison failed.*:FutureWarning")
def test_eq_all_dtypes(self):
super().test_eq_all_dtypes()
@@ -0,0 +1,98 @@
From 70e3f30d5a636f6d847acb2dd0d12cffeb601d41 Mon Sep 17 00:00:00 2001
From: Deepak Cherian <deepak@cherian.net>
Date: Tue, 13 Aug 2024 19:47:10 -0600
Subject: [PATCH 1/2] xfail np.cross tests
xref #9327
---
xarray/core/computation.py | 6 +++---
xarray/tests/test_computation.py | 12 ++++++++----
2 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/xarray/core/computation.py b/xarray/core/computation.py
index 5d21d0836b9..bb7122e82de 100644
--- a/xarray/core/computation.py
+++ b/xarray/core/computation.py
@@ -23,7 +23,7 @@
from xarray.core.merge import merge_attrs, merge_coordinates_without_align
from xarray.core.options import OPTIONS, _get_keep_attrs
from xarray.core.types import Dims, T_DataArray
-from xarray.core.utils import is_dict_like, is_duck_dask_array, is_scalar, parse_dims
+from xarray.core.utils import is_dict_like, is_scalar, parse_dims
from xarray.core.variable import Variable
from xarray.namedarray.parallelcompat import get_chunked_array_type
from xarray.namedarray.pycompat import is_chunked_array
@@ -1693,11 +1693,11 @@ def cross(
if a.sizes[dim] < b.sizes[dim]:
a = a.pad({dim: (0, 1)}, constant_values=0)
# TODO: Should pad or apply_ufunc handle correct chunking?
- a = a.chunk({dim: -1}) if is_duck_dask_array(a.data) else a
+ a = a.chunk({dim: -1}) if is_chunked_array(a.data) else a
else:
b = b.pad({dim: (0, 1)}, constant_values=0)
# TODO: Should pad or apply_ufunc handle correct chunking?
- b = b.chunk({dim: -1}) if is_duck_dask_array(b.data) else b
+ b = b.chunk({dim: -1}) if is_chunked_array(b.data) else b
else:
raise ValueError(
f"{dim!r} on {'a' if a.sizes[dim] == 1 else 'b'} is incompatible:"
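`is_chunked_array()` generalizes the old dask-only check to any chunked backend. The rechunk sits on the padding path of `xr.cross`, taken when one input is one component short; with in-memory data the same path simply zero-pads:
```python
# The padding path of xr.cross: a 2-vector is zero-padded to 3 components.
import numpy as np
import xarray as xr

a = xr.DataArray(np.array([1, 2]), dims="cartesian")  # treated as (1, 2, 0)
b = xr.DataArray(np.array([4, 5, 6]), dims="cartesian")
print(xr.cross(a, b, dim="cartesian").values)  # [12 -6 -3]
```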
diff --git a/xarray/tests/test_computation.py b/xarray/tests/test_computation.py
index 8b480b02472..e974b8b1ac8 100644
--- a/xarray/tests/test_computation.py
+++ b/xarray/tests/test_computation.py
@@ -2547,7 +2547,8 @@ def test_polyfit_polyval_integration(
"cartesian",
1,
],
- [ # Test 1 sized arrays with coords:
+ # Test 1 sized arrays with coords:
+ pytest.param(
xr.DataArray(
np.array([1]),
dims=["cartesian"],
@@ -2562,8 +2563,10 @@ def test_polyfit_polyval_integration(
np.array([4, 5, 6]),
"cartesian",
-1,
- ],
- [ # Test filling in between with coords:
+ marks=(pytest.mark.xfail(),),
+ ),
+ # Test filling in between with coords:
+ pytest.param(
xr.DataArray(
[1, 2],
dims=["cartesian"],
@@ -2578,7 +2581,8 @@ def test_polyfit_polyval_integration(
np.array([4, 5, 6]),
"cartesian",
-1,
- ],
+ marks=(pytest.mark.xfail(),),
+ ),
],
)
def test_cross(a, b, ae, be, dim: str, axis: int, use_dask: bool) -> None:
From deb9e3266ca163575b200960c14c87fc999dcfc6 Mon Sep 17 00:00:00 2001
From: Deepak Cherian <deepak@cherian.net>
Date: Tue, 13 Aug 2024 19:49:56 -0600
Subject: [PATCH 2/2] Force numpy>=2
---
ci/requirements/environment.yml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/ci/requirements/environment.yml b/ci/requirements/environment.yml
index ef02a3e7f23..40ef4a7fc74 100644
--- a/ci/requirements/environment.yml
+++ b/ci/requirements/environment.yml
@@ -26,7 +26,7 @@ dependencies:
- numba
- numbagg
- numexpr
- - numpy
+ - numpy>=2
- opt_einsum
- packaging
- pandas
@@ -0,0 +1,44 @@
From 17367f3545a48d8b8a18bf8f7054b19351c255dc Mon Sep 17 00:00:00 2001
From: Justus Magin <keewis@posteo.de>
Date: Tue, 27 Aug 2024 15:18:32 +0200
Subject: [PATCH 1/3] also call `np.asarray` on numpy scalars
---
xarray/core/variable.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
Index: xarray-2024.07.0/xarray/core/variable.py
===================================================================
--- xarray-2024.07.0.orig/xarray/core/variable.py
+++ xarray-2024.07.0/xarray/core/variable.py
@@ -309,7 +309,7 @@ def as_compatible_data(
else:
data = np.asarray(data)
- if not isinstance(data, np.ndarray) and (
+ if not isinstance(data, np.ndarray | np.generic) and (
hasattr(data, "__array_function__") or hasattr(data, "__array_namespace__")
):
return cast("T_DuckArray", data)
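Under newer numpy, bare scalars such as `np.float64(1.4)` can satisfy the duck-array attribute check (`__array_function__` / `__array_namespace__`), so without the extra `np.generic` test they would be passed through unchanged; the patch routes them through `np.asarray` instead. A quick check against the patched helper:
```python
# numpy scalars now convert to 0-d ndarrays instead of passing through.
import numpy as np
from xarray.core.variable import as_compatible_data

data = as_compatible_data(np.float64(1.4))
print(type(data).__name__, data.shape)  # ndarray ()
```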
Index: xarray-2024.07.0/xarray/tests/test_variable.py
===================================================================
--- xarray-2024.07.0.orig/xarray/tests/test_variable.py
+++ xarray-2024.07.0/xarray/tests/test_variable.py
@@ -2585,10 +2585,15 @@ class TestAsCompatibleData(Generic[T_Duc
assert source_ndarray(x) is source_ndarray(as_compatible_data(x))
def test_converted_types(self):
- for input_array in [[[0, 1, 2]], pd.DataFrame([[0, 1, 2]])]:
+ for input_array in [
+ [[0, 1, 2]],
+ pd.DataFrame([[0, 1, 2]]),
+ np.float64(1.4),
+ np.str_("abc"),
+ ]:
actual = as_compatible_data(input_array)
assert_array_equal(np.asarray(input_array), actual)
- assert np.ndarray == type(actual)
+ assert np.ndarray is type(actual)
assert np.asarray(input_array).dtype == actual.dtype
def test_masked_array(self):