9 Commits

31f59d6abe Accepting request 1328693 from devel:languages:python
OBS-URL: https://build.opensuse.org/request/show/1328693
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/python-urllib3_1?expand=0&rev=13
2026-01-26 09:42:59 +00:00
0c80991ea0 - Add security patches:
  * CVE-2025-66471.patch (bsc#1254867)
  * CVE-2025-66418.patch (bsc#1254866)
  * CVE-2026-21441.patch (bsc#1256331)

OBS-URL: https://build.opensuse.org/package/show/devel:languages:python/python-urllib3_1?expand=0&rev=32
2026-01-22 13:56:43 +00:00
7418831001 Accepting request 1297619 from devel:languages:python
- Do not ignore deprecation warnings; the testsuite explicitly
  clears all warnings multiple times.
- Add patch filter-pyopenssl-deprecationwarning.patch:
  * Explicitly filter out new DeprecationWarnings raised by PyOpenSSL 25.1+
old: openSUSE:Factory/python-urllib3_1
new: devel:languages:python/python-urllib3_1 rev None
Index: python-urllib3_1.changes
===================================================================
--- python-urllib3_1.changes (revision 11)
+++ python-urllib3_1.changes (revision 30)
@@ -1,4 +1,17 @@
 -------------------------------------------------------------------
+Tue Aug  5 05:58:09 UTC 2025 - Steve Kowalik <steven.kowalik@suse.com>
+
+- Do not ignore deprecation warnings; the testsuite explicitly
+  clears all warnings multiple times.
+- Add patch filter-pyopenssl-deprecationwarning.patch:
+  * Explicitly filter out new DeprecationWarnings raised by PyOpenSSL 25.1+
+
+-------------------------------------------------------------------
+Thu Jul 17 20:28:07 UTC 2025 - Dirk Müller <dmueller@suse.com>
+
+- ignore deprecation warnings
+
+-------------------------------------------------------------------
 Wed Jun 25 05:18:37 UTC 2025 - Steve Kowalik <steven.kowalik@suse.com>
 
 - Add patch CVE-2025-50181-poolmanager-redirects.patch:
@@ -71,7 +84,7 @@
 -------------------------------------------------------------------
 Mon May 15 13:52:10 UTC 2023 - Dirk Müller <dmueller@suse.com>
 
-- rename to python-urllib3_1 
+- rename to python-urllib3_1
 
 -------------------------------------------------------------------
 Fri Apr 21 12:38:19 UTC 2023 - Dirk Müller <dmueller@suse.com>
@@ -198,7 +211,7 @@
 
 - update to 1.26.6
   * Deprecated the urllib3.contrib.ntlmpool module.
-  * Changed HTTPConnection.request_chunked() to not erroneously emit multiple 
+  * Changed HTTPConnection.request_chunked() to not erroneously emit multiple
     Transfer-Encoding headers in the case that one is already specified.
   * Fixed typo in deprecation message to recommend Retry.DEFAULT_ALLOWED_METHODS.
 
@@ -280,7 +293,7 @@
     ``Retry.DEFAULT_REMOVE_HEADERS_ON_REDIRECT``, and ``Retry(allowed_methods=...)``
     (Pull #2000) **Starting in urllib3 v2.0: Deprecated options will be removed**
   * Added default ``User-Agent`` header to every request (Pull #1750)
-  * Added ``urllib3.util.SKIP_HEADER`` for skipping ``User-Agent``, ``Accept-Encoding``, 
+  * Added ``urllib3.util.SKIP_HEADER`` for skipping ``User-Agent``, ``Accept-Encoding``,
     and ``Host`` headers from being automatically emitted with requests (Pull #2018)
   * Collapse ``transfer-encoding: chunked`` request data and framing into
     the same ``socket.send()`` call (Pull #1906)
@@ -573,7 +586,7 @@
 - add 1414.patch - fix tests with new tornado
 - refresh python-urllib3-recent-date.patch
 - drop urllib3-test-no-coverage.patch
- * Allow providing a list of headers to strip from requests when redirecting 
+ * Allow providing a list of headers to strip from requests when redirecting
    to a different host. Defaults to the Authorization header. Different
    headers can be set via Retry.remove_headers_on_redirect.
  * Fix util.selectors._fileobj_to_fd to accept long
@@ -921,9 +934,9 @@
   * pyopenssl: Support for TLSv1.1 and TLSv1.2. (Issue #696)
   * Close connections more defensively on exception. (Issue #734)
   * Adjusted read_chunked to handle gzipped, chunk-encoded bodies
-    without repeatedly flushing the decoder, to function better on 
+    without repeatedly flushing the decoder, to function better on
     Jython. (Issue #743)
-  * Accept ca_cert_dir for SSL-related PoolManager configuration. 
+  * Accept ca_cert_dir for SSL-related PoolManager configuration.
     (Issue #758)
 
 - removed ready-event.patch: applied upstream
@@ -963,12 +976,12 @@
 -------------------------------------------------------------------
 Tue Oct  6 15:03:05 UTC 2015 - hpj@urpla.net
 
-- add python-pyOpenSSL, python-certifi and python-pyasn1 requirements 
+- add python-pyOpenSSL, python-certifi and python-pyasn1 requirements
 
 -------------------------------------------------------------------
 Tue Oct  6 12:46:25 UTC 2015 - hpj@urpla.net
 
-- Comment out test requirements, as tests are disabled anyway, and 
+- Comment out test requirements, as tests are disabled anyway, and
  one of these packages depends on python-requests, which depends on
  this package, resulting in a circular dependency for openSUSE <= 13.1
 
@@ -978,9 +991,9 @@
 - Update to version 1.12
   * Rely on six for importing httplib to work around conflicts with
     other Python 3 shims. (Issue #688)
-  * Add support for directories of certificate authorities, as 
+  * Add support for directories of certificate authorities, as
     supported by OpenSSL. (Issue #701)
-  * New exception: NewConnectionError, raised when we fail to 
+  * New exception: NewConnectionError, raised when we fail to
     establish a new connection, usually ECONNREFUSED socket error.
 - Fix version dependencies
 - Add new build requirements following upstream changes
@@ -988,7 +1001,7 @@
   * python-tox
   * python-twine
   * python-wheel
-- Update 0001-Don-t-pin-dependency-to-exact-version.patch 
+- Update 0001-Don-t-pin-dependency-to-exact-version.patch
 - Disable tests for now, as they require network
 
 -------------------------------------------------------------------
@@ -998,42 +1011,42 @@
 - Rebase 0001-Don-t-pin-dependency-to-exact-version.patch and
   urllib3-test-no-coverage.patch
 - Update to version 1.9 (2014-07-04)
-  * Shuffled around development-related files. 
-    If you're maintaining a distro package of urllib3, you may need 
+  * Shuffled around development-related files.
+    If you're maintaining a distro package of urllib3, you may need
     to tweak things. (Issue #415)
-  * Unverified HTTPS requests will trigger a warning on the first 
+  * Unverified HTTPS requests will trigger a warning on the first
     request. See our new security documentation for details.
     (Issue #426)
-  * New retry logic and urllib3.util.retry.Retry configuration 
+  * New retry logic and urllib3.util.retry.Retry configuration
     object. (Issue #326)
-  * All raised exceptions should now be wrapped in a 
-    urllib3.exceptions.HTTPException-extending exception. 
+  * All raised exceptions should now be wrapped in a
+    urllib3.exceptions.HTTPException-extending exception.
     (Issue #326)
   * All errors during a retry-enabled request should be wrapped in
-    urllib3.exceptions.MaxRetryError, including timeout-related 
-    exceptions which were previously exempt. Underlying error is 
+    urllib3.exceptions.MaxRetryError, including timeout-related
+    exceptions which were previously exempt. Underlying error is
    accessible from the .reason property. (Issue #326)
-  * urllib3.exceptions.ConnectionError renamed to 
+  * urllib3.exceptions.ConnectionError renamed to
     urllib3.exceptions.ProtocolError. (Issue #326)
   * Errors during response read (such as IncompleteRead) are now
     wrapped in urllib3.exceptions.ProtocolError. (Issue #418)
-  * Requesting an empty host will raise 
+  * Requesting an empty host will raise
     urllib3.exceptions.LocationValueError. (Issue #417)
-  * Catch read timeouts over SSL connections as 
+  * Catch read timeouts over SSL connections as
     urllib3.exceptions.ReadTimeoutError. (Issue #419)
   * Apply socket arguments before connecting. (Issue #427)
 - Update to version 1.8.3 (2014-06-23)
-  * Fix TLS verification when using a proxy in Python 3.4.1. 
+  * Fix TLS verification when using a proxy in Python 3.4.1.
     (Issue #385)
-  * Add disable_cache option to urllib3.util.make_headers. 
+  * Add disable_cache option to urllib3.util.make_headers.
     (Issue #393)
-  * Wrap socket.timeout exception with 
+  * Wrap socket.timeout exception with
     urllib3.exceptions.ReadTimeoutError. (Issue #399)
-  * Fixed proxy-related bug where connections were being reused 
+  * Fixed proxy-related bug where connections were being reused
     incorrectly. (Issues #366, #369)
-  * Added socket_options keyword parameter which allows to define 
+  * Added socket_options keyword parameter which allows to define
     setsockopt configuration of new sockets. (Issue #397)
-  * Removed HTTPConnection.tcp_nodelay in favor of 
+  * Removed HTTPConnection.tcp_nodelay in favor of
     HTTPConnection.default_socket_options. (Issue #397)
   * Fixed TypeError bug in Python 2.6.4. (Issue #411)
 - Update to version 1.8.2 (2014-04-17)
@@ -1041,7 +1054,7 @@
 - Update to version 1.8.1 (2014-04-17)
   * Fix AppEngine bug of HTTPS requests going out as HTTP.
     (Issue #356)
-  * Don't install dummyserver into site-packages as it's only 
+  * Don't install dummyserver into site-packages as it's only
     needed for the test suite. (Issue #362)
   * Added support for specifying source_address. (Issue #352)
 
Index: python-urllib3_1.spec
===================================================================
--- python-urllib3_1.spec (revision 11)
+++ python-urllib3_1.spec (revision 30)
@@ -1,7 +1,7 @@
 #
 # spec file for package python-urllib3_1
 #
-# Copyright (c) 2025 SUSE LLC
+# Copyright (c) 2025 SUSE LLC and contributors
 #
 # All modifications and additions to the file contributed by third parties
 # remain the property of their copyright owners, unless otherwise agreed
@@ -37,6 +37,8 @@
 Patch0:         remove_mock.patch
 # PATCH-FIX-UPSTREAM CVE-2025-50181 gh#urllib3/urllib3@f05b1329126d, bsc#1244925
 Patch1:         CVE-2025-50181-poolmanager-redirects.patch
+# PATCH-FIX-OPENSUSE Explicitly ignore new DeprecationWarning from PyOpenSSL 25.1+
+Patch2:         filter-pyopenssl-deprecationwarning.patch
 BuildRequires:  %{python_module base >= 3.7}
 BuildRequires:  %{python_module pip}
 BuildRequires:  %{python_module setuptools}
Index: filter-pyopenssl-deprecationwarning.patch
===================================================================
--- filter-pyopenssl-deprecationwarning.patch (added)
+++ filter-pyopenssl-deprecationwarning.patch (revision 30)
@@ -0,0 +1,133 @@
+Index: urllib3-1.26.20/test/with_dummyserver/test_https.py
+===================================================================
+--- urllib3-1.26.20.orig/test/with_dummyserver/test_https.py
++++ urllib3-1.26.20/test/with_dummyserver/test_https.py
+@@ -215,6 +215,10 @@ class TestHTTPS(HTTPSDummyServerTestCase
+             assert conn.__class__ == VerifiedHTTPSConnection
+ 
+             with warnings.catch_warnings(record=True) as w:
++                # Filter PyOpenSSL 25.1+ DeprecationWarning
++                warnings.filterwarnings(
++                    "ignore", message="Attempting to mutate a Context after", category=DeprecationWarning
++                )
+                 r = https_pool.request("GET", "/")
+                 assert r.status == 200
+ 
+@@ -245,6 +249,13 @@ class TestHTTPS(HTTPSDummyServerTestCase
+                 r = https_pool.request("GET", "/")
+                 assert r.status == 200
+ 
++                # Filter PyOpenSSL 25.1+ DeprecationWarning
++                calls = warn.call_args_list
++                calls = [
++                    call for call in calls if call[0][1] != DeprecationWarning and
++                    not call[0][0].startswith("Attempting to mutate a Context")
++                ]
++
+                 # Modern versions of Python, or systems using PyOpenSSL, don't
+                 # emit warnings.
+                 if (
+@@ -252,7 +263,7 @@ class TestHTTPS(HTTPSDummyServerTestCase
+                     or util.IS_PYOPENSSL
+                     or util.IS_SECURETRANSPORT
+                 ):
+-                    assert not warn.called, warn.call_args_list
++                    assert not calls
+                 else:
+                     assert warn.called
+                     if util.HAS_SNI:
+@@ -274,6 +285,13 @@ class TestHTTPS(HTTPSDummyServerTestCase
+                 r = https_pool.request("GET", "/")
+                 assert r.status == 200
+ 
++                # Filter PyOpenSSL 25.1+ DeprecationWarning
++                calls = warn.call_args_list
++                calls = [
++                    call for call in calls if call[0][1] != DeprecationWarning and
++                    not call[0][0].startswith("Attempting to mutate a Context")
++                ]
++
+                 # Modern versions of Python, or systems using PyOpenSSL, don't
+                 # emit warnings.
+                 if (
+@@ -281,7 +299,7 @@ class TestHTTPS(HTTPSDummyServerTestCase
+                     or util.IS_PYOPENSSL
+                     or util.IS_SECURETRANSPORT
+                 ):
+-                    assert not warn.called, warn.call_args_list
++                    assert not calls
+                 else:
+                     assert warn.called
+                     if util.HAS_SNI:
+@@ -306,6 +324,10 @@ class TestHTTPS(HTTPSDummyServerTestCase
+             assert conn.__class__ == VerifiedHTTPSConnection
+ 
+             with warnings.catch_warnings(record=True) as w:
++                # Filter PyOpenSSL 25.1+ DeprecationWarning
++                warnings.filterwarnings(
++                    "ignore", message="Attempting to mutate a Context after", category=DeprecationWarning
++                )
+                 r = https_pool.request("GET", "/")
+                 assert r.status == 200
+ 
+@@ -412,6 +434,12 @@ class TestHTTPS(HTTPSDummyServerTestCase
+                 # warnings, which we want to ignore here.
+                 calls = warn.call_args_list
+ 
++                # Filter PyOpenSSL 25.1+ DeprecationWarning
++                calls = [
++                    call for call in calls if call[0][1] != DeprecationWarning and
++                    not call[0][0].startswith("Attempting to mutate a Context")
++                ]
++
+                 # If we're using a deprecated TLS version we can remove 'DeprecationWarning'
+                 if self.tls_protocol_deprecated():
+                     calls = [call for call in calls if call[0][1] != DeprecationWarning]
+@@ -687,6 +715,11 @@ class TestHTTPS(HTTPSDummyServerTestCase
+     def _request_without_resource_warnings(self, method, url):
+         with warnings.catch_warnings(record=True) as w:
+             warnings.simplefilter("always")
++            # Filter PyOpenSSL 25.1+ DeprecationWarning
++            warnings.filterwarnings(
++                "ignore", message="Attempting to mutate a Context after",
++                category=DeprecationWarning
++            )
+             with HTTPSConnectionPool(
+                 self.host, self.port, ca_certs=DEFAULT_CA
+             ) as https_pool:
+@@ -742,6 +775,11 @@ class TestHTTPS(HTTPSDummyServerTestCase
+             conn = https_pool._get_conn()
+             try:
+                 with warnings.catch_warnings(record=True) as w:
++                    # Filter PyOpenSSL 25.1+ DeprecationWarning
++                    warnings.filterwarnings(
++                        "ignore", message="Attempting to mutate a Context after",
++                        category=DeprecationWarning
++                    )
+                     conn.connect()
+                     if not hasattr(conn.sock, "version"):
+                         pytest.skip("SSLSocket.version() not available")
+@@ -769,6 +807,11 @@ class TestHTTPS(HTTPSDummyServerTestCase
+             conn = https_pool._get_conn()
+             try:
+                 with warnings.catch_warnings(record=True) as w:
++                    # Filter PyOpenSSL 25.1+ DeprecationWarning
++                    warnings.filterwarnings(
++                        "ignore", message="Attempting to mutate a Context after",
++                        category=DeprecationWarning
++                    )
+                     conn.connect()
+             finally:
+                 conn.close()
+@@ -788,6 +831,11 @@ class TestHTTPS(HTTPSDummyServerTestCase
+             conn = https_pool._get_conn()
+             try:
+                 with warnings.catch_warnings(record=True) as w:
++                    # Filter PyOpenSSL 25.1+ DeprecationWarning
++                    warnings.filterwarnings(
++                        "ignore", message="Attempting to mutate a Context after",
++                        category=DeprecationWarning
++                    )
+                     conn.connect()
+             finally:
+                 conn.close()

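A note on the mechanism the patch relies on: warnings.filterwarnings()
treats its message argument as a regular expression matched against the
start of the warning text, so the prefix "Attempting to mutate a Context
after" is enough to match the full PyOpenSSL message. A minimal standalone
sketch of the same pattern (the warning texts here are illustrative):

import warnings

with warnings.catch_warnings(record=True) as w:
    warnings.simplefilter("always")
    # 'message' is a regex matched against the start of the warning text,
    # so a prefix of the PyOpenSSL message is sufficient.
    warnings.filterwarnings(
        "ignore",
        message="Attempting to mutate a Context after",
        category=DeprecationWarning,
    )
    warnings.warn(
        "Attempting to mutate a Context after a Connection was created",
        DeprecationWarning,
    )
    warnings.warn("some unrelated deprecation", DeprecationWarning)

assert len(w) == 1  # only the unrelated warning is recorded
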
OBS-URL: https://build.opensuse.org/request/show/1297619
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/python-urllib3_1?expand=0&rev=12
2025-08-08 13:10:17 +00:00
6ff363237a - Do not ignore deprecation warnings; the testsuite explicitly
  clears all warnings multiple times.
- Add patch filter-pyopenssl-deprecationwarning.patch:
  * Explicitly filter out new DeprecationWarnings raised by PyOpenSSL 25.1+

OBS-URL: https://build.opensuse.org/package/show/devel:languages:python/python-urllib3_1?expand=0&rev=30
2025-08-05 06:00:05 +00:00
9dad7f6f67 - ignore deprecation warnings
- rename to python-urllib3_1
  * Changed HTTPConnection.request_chunked() to not erroneously emit multiple
  * Added ``urllib3.util.SKIP_HEADER`` for skipping ``User-Agent``, ``Accept-Encoding``,
 * Allow providing a list of headers to strip from requests when redirecting
    without repeatedly flushing the decoder, to function better on
  * Accept ca_cert_dir for SSL-related PoolManager configuration.
- add python-pyOpenSSL, python-certifi and python-pyasn1 requirements
- Comment out test requirements, as tests are disabled anyway, and
  * Add support for directories of certificate authorities, as
  * New exception: NewConnectionError, raised when we fail to
- Update 0001-Don-t-pin-dependency-to-exact-version.patch
  * Shuffled around development-related files.
    If you're maintaining a distro package of urllib3, you may need
  * Unverified HTTPS requests will trigger a warning on the first
  * New retry logic and urllib3.util.retry.Retry configuration
 * All raised exceptions should now be wrapped in a
    urllib3.exceptions.HTTPException-extending exception.
    urllib3.exceptions.MaxRetryError, including timeout-related
    exceptions which were previously exempt. Underlying error is
  * urllib3.exceptions.ConnectionError renamed to
  * Requesting an empty host will raise
  * Catch read timeouts over SSL connections as
  * Fix TLS verification when using a proxy in Python 3.4.1.
  * Add disable_cache option to urllib3.util.make_headers.
  * Wrap socket.timeout exception with
  * Fixed proxy-related bug where connections were being reused
  * Added socket_options keyword parameter which allows to define
  * Removed HTTPConnection.tcp_nodelay in favor of
  * Don't install dummyserver into site-packages as it's only

OBS-URL: https://build.opensuse.org/package/show/devel:languages:python/python-urllib3_1?expand=0&rev=29
2025-07-17 20:28:27 +00:00
770de961ef Accepting request 1288435 from devel:languages:python
- Add patch CVE-2025-50181-poolmanager-redirects.patch:
  * Pool managers now properly control redirects when retries is passed
    (CVE-2025-50181, GHSA-pq67-6m6q-mj2v, bsc#1244925)

OBS-URL: https://build.opensuse.org/request/show/1288435
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/python-urllib3_1?expand=0&rev=11
2025-06-27 21:00:42 +00:00
5182225611 - Add patch CVE-2025-50181-poolmanager-redirects.patch:
  * Pool managers now properly control redirects when retries is passed
    (CVE-2025-50181, GHSA-pq67-6m6q-mj2v, bsc#1244925)

OBS-URL: https://build.opensuse.org/package/show/devel:languages:python/python-urllib3_1?expand=0&rev=27
2025-06-25 05:19:12 +00:00
43832bccee Accepting request 1278319 from devel:languages:python
- Skip some tests that fail with latest python-tornado

OBS-URL: https://build.opensuse.org/request/show/1278319
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/python-urllib3_1?expand=0&rev=10
2025-05-23 12:27:27 +00:00
74743786b3 - Skip some tests that fail with latest python-tornado
OBS-URL: https://build.opensuse.org/package/show/devel:languages:python/python-urllib3_1?expand=0&rev=25
2025-05-19 07:29:21 +00:00
7 changed files with 1567 additions and 32 deletions

CVE-2025-50181-poolmanager-redirects.patch

@@ -0,0 +1,230 @@
From f05b1329126d5be6de501f9d1e3e36738bc08857 Mon Sep 17 00:00:00 2001
From: Illia Volochii <illia.volochii@gmail.com>
Date: Wed, 18 Jun 2025 16:25:01 +0300
Subject: [PATCH] Merge commit from fork
* Apply Quentin's suggestion
Co-authored-by: Quentin Pradet <quentin.pradet@gmail.com>
* Add tests for disabled redirects in the pool manager
* Add a possible fix for the issue with not raised `MaxRetryError`
* Make urllib3 handle redirects instead of JS when JSPI is used
* Fix info in the new comment
* State that redirects with XHR are not controlled by urllib3
* Remove excessive params from new test requests
* Add tests reaching max non-0 redirects
* Test redirects with Emscripten
* Fix `test_merge_pool_kwargs`
* Add a changelog entry
* Parametrize tests
* Drop a fix for Emscripten
* Apply Seth's suggestion to docs
Co-authored-by: Seth Michael Larson <sethmichaellarson@gmail.com>
* Use a minor release instead of the patch one
---------
Co-authored-by: Quentin Pradet <quentin.pradet@gmail.com>
Co-authored-by: Seth Michael Larson <sethmichaellarson@gmail.com>
---
CHANGES.rst | 9 ++
docs/reference/contrib/emscripten.rst | 2 +-
dummyserver/app.py | 1 +
src/urllib3/poolmanager.py | 18 +++-
test/contrib/emscripten/test_emscripten.py | 16 ++++
test/test_poolmanager.py | 5 +-
test/with_dummyserver/test_poolmanager.py | 101 +++++++++++++++++++++
7 files changed, 148 insertions(+), 4 deletions(-)
Index: urllib3-1.26.20/src/urllib3/poolmanager.py
===================================================================
--- urllib3-1.26.20.orig/src/urllib3/poolmanager.py
+++ urllib3-1.26.20/src/urllib3/poolmanager.py
@@ -170,6 +170,22 @@ class PoolManager(RequestMethods):
def __init__(self, num_pools=10, headers=None, **connection_pool_kw):
RequestMethods.__init__(self, headers)
+ if "retries" in connection_pool_kw:
+ retries = connection_pool_kw["retries"]
+ if not isinstance(retries, Retry):
+ # When Retry is initialized, raise_on_redirect is based
+ # on a redirect boolean value.
+ # But requests made via a pool manager always set
+ # redirect to False, and raise_on_redirect always ends
+ # up being False consequently.
+ # Here we fix the issue by setting raise_on_redirect to
+ # a value needed by the pool manager without considering
+ # the redirect boolean.
+ raise_on_redirect = retries is not False
+ retries = Retry.from_int(retries, redirect=False)
+ retries.raise_on_redirect = raise_on_redirect
+ connection_pool_kw = connection_pool_kw.copy()
+ connection_pool_kw["retries"] = retries
self.connection_pool_kw = connection_pool_kw
self.pools = RecentlyUsedContainer(num_pools)
@@ -389,7 +405,7 @@ class PoolManager(RequestMethods):
kw["body"] = None
kw["headers"] = HTTPHeaderDict(kw["headers"])._prepare_for_method_change()
- retries = kw.get("retries")
+ retries = kw.get("retries", response.retries)
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect)
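
The constructor guard above normalizes any non-Retry value once, at pool
construction time: whether redirects are followed stays a per-request
decision, but whether exhausting them raises is fixed here. A minimal
standalone sketch of that normalization, assuming the public Retry API
from urllib3.util.retry (the helper name is hypothetical):

from urllib3.util.retry import Retry

def normalize_pool_retries(retries):
    # Mirrors the patched PoolManager.__init__: redirect following is
    # decided per request, so from_int() is called with redirect=False,
    # while raise_on_redirect is derived from the original value.
    if isinstance(retries, Retry):
        return retries
    raise_on_redirect = retries is not False
    normalized = Retry.from_int(retries, redirect=False)
    normalized.raise_on_redirect = raise_on_redirect
    return normalized

print(normalize_pool_retries(0).raise_on_redirect)      # True: raise MaxRetryError
print(normalize_pool_retries(False).raise_on_redirect)  # False: return the 3xx response
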
Index: urllib3-1.26.20/test/test_poolmanager.py
===================================================================
--- urllib3-1.26.20.orig/test/test_poolmanager.py
+++ urllib3-1.26.20/test/test_poolmanager.py
@@ -326,9 +326,10 @@ class TestPoolManager(object):
def test_merge_pool_kwargs(self):
"""Assert _merge_pool_kwargs works in the happy case"""
- p = PoolManager(strict=True)
+ retries = retry.Retry(total=100)
+ p = PoolManager(strict=True, retries=retries)
merged = p._merge_pool_kwargs({"new_key": "value"})
- assert {"strict": True, "new_key": "value"} == merged
+ assert {"retries": retries, "strict": True, "new_key": "value"} == merged
def test_merge_pool_kwargs_none(self):
"""Assert false-y values to _merge_pool_kwargs result in defaults"""
Index: urllib3-1.26.20/test/with_dummyserver/test_poolmanager.py
===================================================================
--- urllib3-1.26.20.orig/test/with_dummyserver/test_poolmanager.py
+++ urllib3-1.26.20/test/with_dummyserver/test_poolmanager.py
@@ -82,6 +82,94 @@ class TestPoolManager(HTTPDummyServerTes
assert r.status == 200
assert r.data == b"Dummy server!"
+ @pytest.mark.parametrize(
+ "retries",
+ (0, Retry(total=0), Retry(redirect=0), Retry(total=0, redirect=0)),
+ )
+ def test_redirects_disabled_for_pool_manager_with_0(
+ self, retries: typing.Literal[0] | Retry
+ ) -> None:
+ """
+ Check handling redirects when retries is set to 0 on the pool
+ manager.
+ """
+ with PoolManager(retries=retries) as http:
+ with pytest.raises(MaxRetryError):
+ http.request("GET", f"{self.base_url}/redirect")
+
+ # Setting redirect=True should not change the behavior.
+ with pytest.raises(MaxRetryError):
+ http.request("GET", f"{self.base_url}/redirect", redirect=True)
+
+ # Setting redirect=False should not make it follow the redirect,
+ # but MaxRetryError should not be raised.
+ response = http.request("GET", f"{self.base_url}/redirect", redirect=False)
+ assert response.status == 303
+
+ @pytest.mark.parametrize(
+ "retries",
+ (
+ False,
+ Retry(total=False),
+ Retry(redirect=False),
+ Retry(total=False, redirect=False),
+ ),
+ )
+ def test_redirects_disabled_for_pool_manager_with_false(
+ self, retries: typing.Literal[False] | Retry
+ ) -> None:
+ """
+ Check that setting retries set to False on the pool manager disables
+ raising MaxRetryError and redirect=True does not change the
+ behavior.
+ """
+ with PoolManager(retries=retries) as http:
+ response = http.request("GET", f"{self.base_url}/redirect")
+ assert response.status == 303
+
+ response = http.request("GET", f"{self.base_url}/redirect", redirect=True)
+ assert response.status == 303
+
+ response = http.request("GET", f"{self.base_url}/redirect", redirect=False)
+ assert response.status == 303
+
+ def test_redirects_disabled_for_individual_request(self) -> None:
+ """
+ Check handling redirects when they are meant to be disabled
+ on the request level.
+ """
+ with PoolManager() as http:
+ # Check when redirect is not passed.
+ with pytest.raises(MaxRetryError):
+ http.request("GET", f"{self.base_url}/redirect", retries=0)
+ response = http.request("GET", f"{self.base_url}/redirect", retries=False)
+ assert response.status == 303
+
+ # Check when redirect=True.
+ with pytest.raises(MaxRetryError):
+ http.request(
+ "GET", f"{self.base_url}/redirect", retries=0, redirect=True
+ )
+ response = http.request(
+ "GET", f"{self.base_url}/redirect", retries=False, redirect=True
+ )
+ assert response.status == 303
+
+ # Check when redirect=False.
+ response = http.request(
+ "GET", f"{self.base_url}/redirect", retries=0, redirect=False
+ )
+ assert response.status == 303
+ response = http.request(
+ "GET", f"{self.base_url}/redirect", retries=False, redirect=False
+ )
+ assert response.status == 303
+
def test_cross_host_redirect(self):
with PoolManager() as http:
cross_host_location = "%s/echo?a=b" % self.base_url_alt
@@ -136,6 +224,24 @@ class TestPoolManager(HTTPDummyServerTes
pool = http.connection_from_host(self.host, self.port)
assert pool.num_connections == 1
+ # Check when retries are configured for the pool manager.
+ with PoolManager(retries=1) as http:
+ with pytest.raises(MaxRetryError):
+ http.request(
+ "GET",
+ f"{self.base_url}/redirect",
+ fields={"target": f"/redirect?target={self.base_url}/"},
+ )
+
+ # Here we allow more retries for the request.
+ response = http.request(
+ "GET",
+ f"{self.base_url}/redirect",
+ fields={"target": f"/redirect?target={self.base_url}/"},
+ retries=2,
+ )
+ assert response.status == 200
+
def test_redirect_cross_host_remove_headers(self):
with PoolManager() as http:
r = http.request(

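Taken together with the tests above, the observable behavior after this
patch can be summarized in a short usage sketch (the local server URL and
its 303-answering /redirect route are illustrative assumptions):

from urllib3 import PoolManager
from urllib3.exceptions import MaxRetryError

base_url = "http://localhost:8080"  # hypothetical test server

with PoolManager(retries=0) as http:
    # Redirects are now refused at the pool level and, with retries=0,
    # exhausting them raises MaxRetryError.
    try:
        http.request("GET", base_url + "/redirect")
    except MaxRetryError:
        pass

    # Per-request redirect=False still suppresses following the redirect
    # without raising; the 303 response is returned as-is.
    r = http.request("GET", base_url + "/redirect", redirect=False)
    assert r.status == 303
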
CVE-2025-66418.patch

@@ -0,0 +1,70 @@
From 24d7b67eac89f94e11003424bcf0d8f7b72222a8 Mon Sep 17 00:00:00 2001
From: Illia Volochii <illia.volochii@gmail.com>
Date: Fri, 5 Dec 2025 16:41:33 +0200
Subject: [PATCH] Merge commit from fork
* Add a hard-coded limit for the decompression chain
* Reuse new list
---
changelog/GHSA-gm62-xv2j-4w53.security.rst | 4 ++++
src/urllib3/response.py | 12 +++++++++++-
test/test_response.py | 10 ++++++++++
3 files changed, 25 insertions(+), 1 deletion(-)
create mode 100644 changelog/GHSA-gm62-xv2j-4w53.security.rst
Index: urllib3-1.26.20/changelog/GHSA-gm62-xv2j-4w53.security.rst
===================================================================
--- /dev/null
+++ urllib3-1.26.20/changelog/GHSA-gm62-xv2j-4w53.security.rst
@@ -0,0 +1,4 @@
+Fixed a security issue where an attacker could compose an HTTP response with
+virtually unlimited links in the ``Content-Encoding`` header, potentially
+leading to a denial of service (DoS) attack by exhausting system resources
+during decoding. The number of allowed chained encodings is now limited to 5.
Index: urllib3-1.26.20/src/urllib3/response.py
===================================================================
--- urllib3-1.26.20.orig/src/urllib3/response.py
+++ urllib3-1.26.20/src/urllib3/response.py
@@ -225,8 +225,18 @@ class MultiDecoder(object):
they were applied.
"""
- def __init__(self, modes):
- self._decoders = [_get_decoder(m.strip()) for m in modes.split(",")]
+ # Maximum allowed number of chained HTTP encodings in the
+ # Content-Encoding header.
+ max_decode_links = 5
+
+ def __init__(self, modes: str) -> None:
+ encodings = [m.strip() for m in modes.split(",")]
+ if len(encodings) > self.max_decode_links:
+ raise DecodeError(
+ "Too many content encodings in the chain: "
+ f"{len(encodings)} > {self.max_decode_links}"
+ )
+ self._decoders = [_get_decoder(e) for e in encodings]
def flush(self):
return self._decoders[0].flush()
Index: urllib3-1.26.20/test/test_response.py
===================================================================
--- urllib3-1.26.20.orig/test/test_response.py
+++ urllib3-1.26.20/test/test_response.py
@@ -477,6 +477,16 @@ class TestResponse(object):
assert r.data == b"foo"
+ def test_read_multi_decoding_too_many_links(self):
+ fp = BytesIO(b"foo")
+ with pytest.raises(
+ DecodeError, match="Too many content encodings in the chain: 6 > 5"
+ ):
+ HTTPResponse(
+ fp,
+ headers={"content-encoding": "gzip, deflate, br, zstd, gzip, deflate"},
+ )
+
def test_body_blob(self):
resp = HTTPResponse(b"foo")
assert resp.data == b"foo"

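Seen from the response-handling side, the guard simply counts the
comma-separated tokens of the Content-Encoding header before any decoder
is built. A standalone re-statement of the check (MAX_LINKS mirrors the
patch's max_decode_links; the function name and exception type are
simplified):

MAX_LINKS = 5  # mirrors MultiDecoder.max_decode_links in the patch

def check_encoding_chain(content_encoding):
    encodings = [m.strip() for m in content_encoding.split(",")]
    if len(encodings) > MAX_LINKS:
        raise ValueError(
            "Too many content encodings in the chain: "
            "%d > %d" % (len(encodings), MAX_LINKS)
        )
    return encodings

# A six-link chain, as in the added test, is rejected.
try:
    check_encoding_chain("gzip, deflate, br, zstd, gzip, deflate")
except ValueError as e:
    assert "6 > 5" in str(e)
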
CVE-2025-66471.patch

@@ -0,0 +1,992 @@
From c19571de34c47de3a766541b041637ba5f716ed7 Mon Sep 17 00:00:00 2001
From: Illia Volochii <illia.volochii@gmail.com>
Date: Fri, 5 Dec 2025 16:40:41 +0200
Subject: [PATCH] Merge commit from fork
* Prevent decompression bomb for zstd in Python 3.14
* Add experimental `decompress_iter` for Brotli
* Update changes for Brotli
* Add `GzipDecoder.decompress_iter`
* Test https://github.com/python-hyper/brotlicffi/pull/207
* Pin Brotli
* Add `decompress_iter` to all decoders and make tests pass
* Pin brotlicffi to an official release
* Revert changes to response.py
* Add `max_length` parameter to all `decompress` methods
* Fix the `test_brotlipy` session
* Unset `_data` on gzip error
* Add a test for memory usage
* Test more methods
* Fix the test for `stream`
* Cover more lines with tests
* Add more coverage
* Make `read1` a bit more efficient
* Fix PyPy tests for Brotli
* Revert an unnecessarily moved check
* Add some comments
* Leave just one `self._obj.decompress` call in `GzipDecoder`
* Refactor test params
* Test reads with all data already in the decompressor
* Prevent needless copying of data decoded with `max_length`
* Rename the changed test
* Note that responses of unknown length should be streamed too
* Add a changelog entry
* Avoid returning a memory view from `BytesQueueBuffer`
* Add one more note to the changelog entry
---
CHANGES.rst | 22 ++++
docs/advanced-usage.rst | 3 +-
docs/user-guide.rst | 4 +-
noxfile.py | 16 ++-
pyproject.toml | 5 +-
src/urllib3/response.py | 279 ++++++++++++++++++++++++++++++++++------
test/test_response.py | 269 +++++++++++++++++++++++++++++++++++++-
uv.lock | 177 +++++++++++--------------
8 files changed, 621 insertions(+), 154 deletions(-)
Index: urllib3-1.26.20/docs/advanced-usage.rst
===================================================================
--- urllib3-1.26.20.orig/docs/advanced-usage.rst
+++ urllib3-1.26.20/docs/advanced-usage.rst
@@ -57,7 +57,8 @@ When using ``preload_content=True`` (the
response body will be read immediately into memory and the HTTP connection
will be released back into the pool without manual intervention.
-However, when dealing with large responses it's often better to stream the response
+However, when dealing with responses of large or unknown length,
+it's often better to stream the response
content using ``preload_content=False``. Setting ``preload_content`` to ``False`` means
that urllib3 will only read from the socket when data is requested.
Index: urllib3-1.26.20/docs/user-guide.rst
===================================================================
--- urllib3-1.26.20.orig/docs/user-guide.rst
+++ urllib3-1.26.20/docs/user-guide.rst
@@ -99,8 +99,8 @@ to a byte string representing the respon
>>> r.data
b'\xaa\xa5H?\x95\xe9\x9b\x11'
-.. note:: For larger responses, it's sometimes better to :ref:`stream <stream>`
- the response.
+.. note:: For responses of large or unknown length, it's sometimes better to
+ :ref:`stream <stream>` the response.
Using io Wrappers with Response Content
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Index: urllib3-1.26.20/src/urllib3/response.py
===================================================================
--- urllib3-1.26.20.orig/src/urllib3/response.py
+++ urllib3-1.26.20/src/urllib3/response.py
@@ -1,5 +1,6 @@
from __future__ import absolute_import
+import collections
import io
import logging
import sys
@@ -23,6 +24,7 @@ from .connection import BaseSSLError, HT
from .exceptions import (
BodyNotHttplibCompatible,
DecodeError,
+ DependencyWarning,
HTTPError,
IncompleteRead,
InvalidChunkLength,
@@ -41,33 +43,60 @@ log = logging.getLogger(__name__)
class DeflateDecoder(object):
def __init__(self):
self._first_try = True
- self._data = b""
+ self._first_try_data = b""
+ self._unfed_data = b""
self._obj = zlib.decompressobj()
def __getattr__(self, name):
return getattr(self._obj, name)
- def decompress(self, data):
- if not data:
+ def decompress(self, data, max_length = -1):
+ data = self._unfed_data + data
+ self._unfed_data = b""
+ if not data and not self._obj.unconsumed_tail:
return data
+ original_max_length = max_length
+ if original_max_length < 0:
+ max_length = 0
+ elif original_max_length == 0:
+ # We should not pass 0 to the zlib decompressor because 0 is
+ # the default value that will make zlib decompress without a
+ # length limit.
+ # Data should be stored for subsequent calls.
+ self._unfed_data = data
+ return b""
+ # Subsequent calls always reuse `self._obj`. zlib requires
+ # passing the unconsumed tail if decompression is to continue
if not self._first_try:
- return self._obj.decompress(data)
+ return self._obj.decompress(
+ self._obj.unconsumed_tail + data, max_length=max_length
+ )
- self._data += data
+ # First call tries with RFC 1950 ZLIB format
+ self._first_try_data += data
try:
- decompressed = self._obj.decompress(data)
+ decompressed = self._obj.decompress(data, max_length=max_length)
if decompressed:
self._first_try = False
- self._data = None
+ self._first_try_data = b""
return decompressed
+ # On failure, it falls back to RFC 1951 DEFLATE format.
except zlib.error:
self._first_try = False
self._obj = zlib.decompressobj(-zlib.MAX_WBITS)
try:
- return self.decompress(self._data)
+ return self.decompress(
+ self._first_try_data, max_length=original_max_length
+ )
finally:
- self._data = None
+ self._first_try_data = b""
+
+ @property
+ def has_unconsumed_tail(self):
+ return bool(self._unfed_data) or (
+ bool(self._obj.unconsumed_tail) and not self._first_try
+ )
class GzipDecoderState(object):
@@ -81,30 +110,65 @@ class GzipDecoder(object):
def __init__(self):
self._obj = zlib.decompressobj(16 + zlib.MAX_WBITS)
self._state = GzipDecoderState.FIRST_MEMBER
+ self._unconsumed_tail = b""
def __getattr__(self, name):
return getattr(self._obj, name)
- def decompress(self, data):
+ def decompress(self, data, max_length = -1):
ret = bytearray()
- if self._state == GzipDecoderState.SWALLOW_DATA or not data:
+ if self._state == GzipDecoderState.SWALLOW_DATA:
return bytes(ret)
+
+ if max_length == 0:
+ # We should not pass 0 to the zlib decompressor because 0 is
+ # the default value that will make zlib decompress without a
+ # length limit.
+ # Data should be stored for subsequent calls.
+ self._unconsumed_tail += data
+ return b""
+
+ # zlib requires passing the unconsumed_tail to the subsequent
+ # call if the decompression is to continue.
+ data = self._unconsumed_tail + data
+ if not data and self._obj.eof:
+ return bytes(ret)
+
while True:
try:
- ret += self._obj.decompress(data)
+ ret += self._obj.decompress(
+ data, max_length=max(max_length - len(ret), 0)
+ )
except zlib.error:
previous_state = self._state
# Ignore data after the first error
self._state = GzipDecoderState.SWALLOW_DATA
+ self._unconsumed_tail = b""
if previous_state == GzipDecoderState.OTHER_MEMBERS:
# Allow trailing garbage acceptable in other gzip clients
return bytes(ret)
raise
+
+ self._unconsumed_tail = data = (
+ self._obj.unconsumed_tail or self._obj.unused_data
+ )
+ if max_length > 0 and len(ret) >= max_length:
+ break
+
data = self._obj.unused_data
if not data:
return bytes(ret)
- self._state = GzipDecoderState.OTHER_MEMBERS
- self._obj = zlib.decompressobj(16 + zlib.MAX_WBITS)
+ # When the end of a gzip member is reached, a new decompressor
+ # must be created for unused (possibly future) data.
+ if self._obj.eof:
+ self._state = GzipDecoderState.OTHER_MEMBERS
+ self._obj = zlib.decompressobj(16 + zlib.MAX_WBITS)
+
+ return bytes(ret)
+
+ @property
+ def has_unconsumed_tail(self):
+ return bool(self._unconsumed_tail)
if brotli is not None:
@@ -116,9 +180,35 @@ if brotli is not None:
def __init__(self):
self._obj = brotli.Decompressor()
if hasattr(self._obj, "decompress"):
- self.decompress = self._obj.decompress
+ setattr(self, "_decompress", self._obj.decompress)
else:
- self.decompress = self._obj.process
+ setattr(self, "_decompress", self._obj.process)
+
+ # Requires Brotli >= 1.2.0 for `output_buffer_limit`.
+ def _decompress(self, data, output_buffer_limit = -1):
+ raise NotImplementedError()
+
+ def decompress(self, data, max_length = -1):
+ try:
+ if max_length > 0:
+ return self._decompress(data, output_buffer_limit=max_length)
+ else:
+ return self._decompress(data)
+ except TypeError:
+ # Fallback for Brotli/brotlicffi/brotlipy versions without
+ # the `output_buffer_limit` parameter.
+ warnings.warn(
+                "Brotli >= 1.2.0 is required to prevent decompression bombs.",
+ DependencyWarning,
+ )
+ return self._decompress(data)
+
+ @property
+ def has_unconsumed_tail(self):
+ try:
+ return not self._obj.can_accept_more_data()
+ except AttributeError:
+ return False
def flush(self):
if hasattr(self._obj, "flush"):
@@ -141,10 +231,30 @@ class MultiDecoder(object):
def flush(self):
return self._decoders[0].flush()
- def decompress(self, data):
- for d in reversed(self._decoders):
- data = d.decompress(data)
- return data
+ def decompress(self, data, max_length = -1):
+ if max_length <= 0:
+ for d in reversed(self._decoders):
+ data = d.decompress(data)
+ return data
+
+ ret = bytearray()
+
+ while True:
+ any_data = False
+ for d in reversed(self._decoders):
+ data = d.decompress(data, max_length=max_length - len(ret))
+ if data:
+ any_data = True
+ # We should not break when no data is returned because
+ # next decoders may produce data even with empty input.
+ ret += data
+ if not any_data or len(ret) >= max_length:
+ return bytes(ret)
+ data = b""
+
+ @property
+ def has_unconsumed_tail(self):
+ return any(d.has_unconsumed_tail for d in self._decoders)
def _get_decoder(mode):
@@ -160,6 +270,67 @@ def _get_decoder(mode):
return DeflateDecoder()
+class BytesQueueBuffer:
+ """Memory-efficient bytes buffer
+
+ To return decoded data in read() and still follow the BufferedIOBase API, we need a
+ buffer to always return the correct amount of bytes.
+
+ This buffer should be filled using calls to put()
+
+ Our maximum memory usage is determined by the sum of the size of:
+
+ * self.buffer, which contains the full data
+ * the largest chunk that we will copy in get()
+ """
+
+ def __init__(self):
+ self.buffer = collections.deque()
+ self._size: int = 0
+
+ def __len__(self):
+ return self._size
+
+ def put(self, data):
+ self.buffer.append(data)
+ self._size += len(data)
+
+ def get(self, n: int):
+ if n == 0:
+ return b""
+ elif not self.buffer:
+ raise RuntimeError("buffer is empty")
+ elif n < 0:
+ raise ValueError("n should be > 0")
+
+ if len(self.buffer[0]) == n and isinstance(self.buffer[0], bytes):
+ self._size -= n
+ return self.buffer.popleft()
+
+ fetched = 0
+ ret = io.BytesIO()
+ while fetched < n:
+ remaining = n - fetched
+ chunk = self.buffer.popleft()
+ chunk_length = len(chunk)
+ if remaining < chunk_length:
+ chunk = memoryview(chunk)
+ left_chunk, right_chunk = chunk[:remaining], chunk[remaining:]
+ ret.write(left_chunk)
+ self.buffer.appendleft(right_chunk)
+ self._size -= remaining
+ break
+ else:
+ ret.write(chunk)
+ self._size -= chunk_length
+ fetched += chunk_length
+
+ if not self.buffer:
+ break
+
+ return ret.getvalue()
+
+
class HTTPResponse(io.IOBase):
"""
HTTP Response container.
@@ -228,6 +399,7 @@ class HTTPResponse(io.IOBase):
self.reason = reason
self.strict = strict
self.decode_content = decode_content
+ self._has_decoded_content = False
self.retries = retries
self.enforce_content_length = enforce_content_length
self.auto_close = auto_close
@@ -261,6 +433,9 @@ class HTTPResponse(io.IOBase):
# Determine length of response
self.length_remaining = self._init_length(request_method)
+ # Used to return the correct amount of bytes for partial read()s
+ self._decoded_buffer = BytesQueueBuffer()
+
# If requested, preload the body.
if preload_content and not self._body:
self._body = self.read(decode_content=decode_content)
@@ -395,16 +570,19 @@ class HTTPResponse(io.IOBase):
if brotli is not None:
DECODER_ERROR_CLASSES += (brotli.error,)
- def _decode(self, data, decode_content, flush_decoder):
+ def _decode(self, data, decode_content, flush_decoder, max_length = None):
"""
Decode the data passed in and potentially flush the decoder.
"""
if not decode_content:
return data
+ if max_length is None or flush_decoder:
+ max_length = -1
+
try:
if self._decoder:
- data = self._decoder.decompress(data)
+ data = self._decoder.decompress(data, max_length=max_length)
except self.DECODER_ERROR_CLASSES as e:
content_encoding = self.headers.get("content-encoding", "").lower()
raise DecodeError(
@@ -532,6 +710,47 @@ class HTTPResponse(io.IOBase):
# StringIO doesn't like amt=None
return self._fp.read(amt) if amt is not None else self._fp.read()
+ def _raw_read(
+ self,
+ amt=None,
+ ):
+ """
+ Reads `amt` of bytes from the socket.
+ """
+ if self._fp is None:
+ return
+
+ fp_closed = getattr(self._fp, "closed", False)
+
+ with self._error_catcher():
+ data = self._fp_read(amt) if not fp_closed else b""
+ if amt is not None and amt != 0 and not data:
+ # Platform-specific: Buggy versions of Python.
+ # Close the connection when no data is returned
+ #
+ # This is redundant to what httplib/http.client _should_
+ # already do. However, versions of python released before
+ # December 15, 2012 (http://bugs.python.org/issue16298) do
+ # not properly close the connection in all cases. There is
+ # no harm in redundantly calling close.
+ self._fp.close()
+ if self.enforce_content_length and self.length_remaining not in (
+ 0,
+ None,
+ ):
+ # This is an edge case that httplib failed to cover due
+ # to concerns of backward compatibility. We're
+ # addressing it here to make sure IncompleteRead is
+ # raised during streaming, so all calls with incorrect
+ # Content-Length are caught.
+ raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
+
+ if data:
+ self._fp_bytes_read += len(data)
+ if self.length_remaining is not None:
+ self.length_remaining -= len(data)
+ return data
+
def read(self, amt=None, decode_content=None, cache_content=False):
"""
Similar to :meth:`http.client.HTTPResponse.read`, but with two additional
@@ -557,53 +776,73 @@ class HTTPResponse(io.IOBase):
if decode_content is None:
decode_content = self.decode_content
- if self._fp is None:
- return
+ if amt and amt < 0:
+ # Negative numbers and `None` should be treated the same.
+ amt = None
+ elif amt is not None:
+ cache_content = False
+
+ if self._decoder and self._decoder.has_unconsumed_tail:
+ decoded_data = self._decode(
+ b"",
+ decode_content,
+ flush_decoder=False,
+ max_length=amt - len(self._decoded_buffer),
+ )
+ self._decoded_buffer.put(decoded_data)
+ if len(self._decoded_buffer) >= amt:
+ return self._decoded_buffer.get(amt)
- flush_decoder = False
- fp_closed = getattr(self._fp, "closed", False)
+ data = self._raw_read(amt)
- with self._error_catcher():
- data = self._fp_read(amt) if not fp_closed else b""
- if amt is None:
- flush_decoder = True
- else:
- cache_content = False
- if (
- amt != 0 and not data
- ): # Platform-specific: Buggy versions of Python.
- # Close the connection when no data is returned
- #
- # This is redundant to what httplib/http.client _should_
- # already do. However, versions of python released before
- # December 15, 2012 (http://bugs.python.org/issue16298) do
- # not properly close the connection in all cases. There is
- # no harm in redundantly calling close.
- self._fp.close()
- flush_decoder = True
- if self.enforce_content_length and self.length_remaining not in (
- 0,
- None,
- ):
- # This is an edge case that httplib failed to cover due
- # to concerns of backward compatibility. We're
- # addressing it here to make sure IncompleteRead is
- # raised during streaming, so all calls with incorrect
- # Content-Length are caught.
- raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
+ flush_decoder = amt is None or (amt != 0 and not data)
- if data:
- self._fp_bytes_read += len(data)
- if self.length_remaining is not None:
- self.length_remaining -= len(data)
+ if (
+ not data
+ and len(self._decoded_buffer) == 0
+ and not (self._decoder and self._decoder.has_unconsumed_tail)
+ ):
+ return data
+ if amt is None:
data = self._decode(data, decode_content, flush_decoder)
-
if cache_content:
self._body = data
+ else:
+ # do not waste memory on buffer when not decoding
+ if not decode_content:
+ if self._has_decoded_content:
+ raise RuntimeError(
+                        "Calling read(decode_content=False) is not supported after "
+ "read(decode_content=True) was called."
+ )
+ return data
+
+ decoded_data = self._decode(
+ data,
+ decode_content,
+ flush_decoder,
+ max_length=amt - len(self._decoded_buffer),
+ )
+ self._decoded_buffer.put(decoded_data)
+
+ while len(self._decoded_buffer) < amt and data:
+ # TODO make sure to initially read enough data to get past the headers
+ # For example, the GZ file header takes 10 bytes, we don't want to read
+ # it one byte at a time
+ data = self._raw_read(amt)
+ decoded_data = self._decode(
+ data,
+ decode_content,
+ flush_decoder,
+ max_length=amt - len(self._decoded_buffer),
+ )
+ self._decoded_buffer.put(decoded_data)
+ data = self._decoded_buffer.get(amt)
return data
+
def stream(self, amt=2 ** 16, decode_content=None):
"""
A generator wrapper for the read() method. A call will block until
@@ -624,7 +863,11 @@ class HTTPResponse(io.IOBase):
for line in self.read_chunked(amt, decode_content=decode_content):
yield line
else:
- while not is_fp_closed(self._fp):
+            while (
+ not is_fp_closed(self._fp)
+ or len(self._decoded_buffer) > 0
+ or (self._decoder and self._decoder.has_unconsumed_tail)
+ ):
data = self.read(amt=amt, decode_content=decode_content)
if data:
@@ -830,7 +1073,10 @@ class HTTPResponse(io.IOBase):
break
chunk = self._handle_chunk(amt)
decoded = self._decode(
- chunk, decode_content=decode_content, flush_decoder=False
+ chunk,
+ decode_content=decode_content,
+ flush_decoder=False,
+ max_length=amt,
)
if decoded:
yield decoded
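
The practical upshot of the max_length plumbing above is that a caller who
streams with a fixed amt gets bounded decompression per read, even for a
highly compressible body. A hedged local sketch against the patched
1.26.20 (no network; the 10 MiB payload is illustrative):

import zlib
from io import BytesIO
from urllib3.response import HTTPResponse

# 10 MiB of zeros compresses to a few KiB: a miniature decompression bomb.
compressed = zlib.compress(b"\x00" * (10 * 2**20))

resp = HTTPResponse(
    BytesIO(compressed),
    headers={"content-encoding": "deflate"},
    preload_content=False,
)

# With the patch, each read(amt) yields at most amt decoded bytes, so peak
# memory tracks amt rather than the decompressed size.
total = 0
while True:
    chunk = resp.read(64 * 1024, decode_content=True)
    if not chunk:
        break
    total += len(chunk)
assert total == 10 * 2**20
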
Index: urllib3-1.26.20/setup.py
===================================================================
--- urllib3-1.26.20.orig/setup.py
+++ urllib3-1.26.20/setup.py
@@ -88,10 +88,10 @@ setup(
extras_require={
"brotli": [
# https://github.com/google/brotli/issues/1074
- "brotli==1.0.9; os_name != 'nt' and python_version < '3' and platform_python_implementation == 'CPython'",
- "brotli>=1.0.9; python_version >= '3' and platform_python_implementation == 'CPython'",
- "brotlicffi>=0.8.0; (os_name != 'nt' or python_version >= '3') and platform_python_implementation != 'CPython'",
- "brotlipy>=0.6.0; os_name == 'nt' and python_version < '3'",
+ "brotli==1.2.0; os_name != 'nt' and python_version < '3' and platform_python_implementation == 'CPython'",
+ "brotli>=1.2.0; python_version >= '3' and platform_python_implementation == 'CPython'",
+ "brotlicffi>=1.2.0; (os_name != 'nt' or python_version >= '3') and platform_python_implementation != 'CPython'",
+ "brotlipy>=1.2.0; os_name == 'nt' and python_version < '3'",
],
"secure": [
"pyOpenSSL>=0.14",
Index: urllib3-1.26.20/test/test_response.py
===================================================================
--- urllib3-1.26.20.orig/test/test_response.py
+++ urllib3-1.26.20/test/test_response.py
@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
import contextlib
+import gzip
import re
import socket
import ssl
@@ -28,7 +29,7 @@ from urllib3.exceptions import (
httplib_IncompleteRead,
)
from urllib3.packages.six.moves import http_client as httplib
-from urllib3.response import HTTPResponse, brotli
+from urllib3.response import HTTPResponse, BytesQueueBuffer, brotli
from urllib3.util.response import is_fp_closed
from urllib3.util.retry import RequestHistory, Retry
@@ -56,6 +57,30 @@ def sock():
yield s
s.close()
+def deflate2_compress(data):
+ compressor = zlib.compressobj(6, zlib.DEFLATED, -zlib.MAX_WBITS)
+ return compressor.compress(data) + compressor.flush()
+
+
+if brotli:
+ try:
+ brotli.Decompressor().process(b"", output_buffer_limit=1024)
+ _brotli_gte_1_2_0_available = True
+ except (AttributeError, TypeError):
+ _brotli_gte_1_2_0_available = False
+else:
+ _brotli_gte_1_2_0_available = False
+
+
+class TestBytesQueueBuffer:
+ def test_memory_usage_single_chunk(
+ self
+ ):
+ buffer = BytesQueueBuffer()
+ chunk = bytes(10 * 2**20) # 10 MiB
+ buffer.put(chunk)
+ assert buffer.get(len(buffer)) is chunk
+
class TestLegacyResponse(object):
def test_getheaders(self):
@@ -146,12 +171,7 @@ class TestResponse(object):
fp, headers={"content-encoding": "deflate"}, preload_content=False
)
- assert r.read(3) == b""
- # Buffer in case we need to switch to the raw stream
- assert r._decoder._data is not None
assert r.read(1) == b"f"
- # Now that we've decoded data, we just stream through the decoder
- assert r._decoder._data is None
assert r.read(2) == b"oo"
assert r.read() == b""
assert r.read() == b""
@@ -166,10 +186,7 @@ class TestResponse(object):
fp, headers={"content-encoding": "deflate"}, preload_content=False
)
- assert r.read(1) == b""
assert r.read(1) == b"f"
- # Once we've decoded data, we just stream to the decoder; no buffering
- assert r._decoder._data is None
assert r.read(2) == b"oo"
assert r.read() == b""
assert r.read() == b""
@@ -184,7 +201,6 @@ class TestResponse(object):
fp, headers={"content-encoding": "gzip"}, preload_content=False
)
- assert r.read(11) == b""
assert r.read(1) == b"f"
assert r.read(2) == b"oo"
assert r.read() == b""
@@ -266,6 +282,157 @@ class TestResponse(object):
with pytest.raises(DecodeError):
HTTPResponse(fp, headers={"content-encoding": "br"})
+ _test_compressor_params = [
+ ("deflate1", ("deflate", zlib.compress)),
+ ("deflate2", ("deflate", deflate2_compress)),
+ ("gzip", ("gzip", gzip.compress)),
+ ]
+ if _brotli_gte_1_2_0_available:
+ _test_compressor_params.append(("brotli", ("br", brotli.compress)))
+ else:
+ _test_compressor_params.append(("brotli", None))
+
+ @pytest.mark.parametrize(
+ "data",
+ [d[1] for d in _test_compressor_params],
+ ids=[d[0] for d in _test_compressor_params],
+ )
+ def test_read_with_all_data_already_in_decompressor(
+ self,
+ request,
+ data,
+ ):
+ if data is None:
+ pytest.skip(f"Proper {request.node.callspec.id} decoder is not available")
+ original_data = b"bar" * 1000
+ name, compress_func = data
+ compressed_data = compress_func(original_data)
+ fp = mock.Mock(read=mock.Mock(return_value=b""))
+ r = HTTPResponse(fp, headers={"content-encoding": name}, preload_content=False)
+ # Put all data in the decompressor's buffer.
+ r._init_decoder()
+ assert r._decoder is not None # for mypy
+ decoded = r._decoder.decompress(compressed_data, max_length=0)
+ if name == "br":
+ # It's known that some Brotli libraries do not respect
+ # `max_length`.
+ r._decoded_buffer.put(decoded)
+ else:
+ assert decoded == b""
+ # Read the data via `HTTPResponse`.
+ read = getattr(r, "read")
+ assert read(0) == b""
+ assert read(2500) == original_data[:2500]
+ assert read(500) == original_data[2500:]
+ assert read(0) == b""
+ assert read() == b""
+
+ @pytest.mark.parametrize(
+ "delta",
+ (
+ 0, # First read from socket returns all compressed data.
+ -1, # First read from socket returns all but one byte of compressed data.
+ ),
+ )
+ @pytest.mark.parametrize(
+ "data",
+ [d[1] for d in _test_compressor_params],
+ ids=[d[0] for d in _test_compressor_params],
+ )
+ def test_decode_with_max_length_close_to_compressed_data_size(
+ self,
+ request,
+ delta,
+ data,
+ ):
+ """
+ Test decoding when the first read from the socket returns all or
+ almost all the compressed data, but then it has to be
+ decompressed in a couple of read calls.
+ """
+ if data is None:
+ pytest.skip(f"Proper {request.node.callspec.id} decoder is not available")
+
+ original_data = b"foo" * 1000
+ name, compress_func = data
+ compressed_data = compress_func(original_data)
+ fp = BytesIO(compressed_data)
+ r = HTTPResponse(fp, headers={"content-encoding": name}, preload_content=False)
+ initial_limit = len(compressed_data) + delta
+ read = getattr(r, "read")
+ initial_chunk = read(amt=initial_limit, decode_content=True)
+ assert len(initial_chunk) == initial_limit
+ assert (
+ len(read(amt=len(original_data), decode_content=True))
+ == len(original_data) - initial_limit
+ )
+
+ # Prepare 50 MB of compressed data outside of the test measuring
+ # memory usage.
+ _test_memory_usage_decode_with_max_length_params = [
+ (
+ params[0],
+ (params[1][0], params[1][1](b"A" * (50 * 2**20))) if params[1] else None,
+ )
+ for params in _test_compressor_params
+ ]
+
+ @pytest.mark.parametrize(
+ "data",
+ [d[1] for d in _test_memory_usage_decode_with_max_length_params],
+ ids=[d[0] for d in _test_memory_usage_decode_with_max_length_params],
+ )
+ @pytest.mark.parametrize("read_method", ("read", "read_chunked", "stream"))
+ # Decoders consume different amounts of memory during decompression.
+ # We set the 10 MB limit to ensure that the whole decompressed data
+ # is not stored unnecessarily.
+ #
+ # FYI, the following consumption was observed for the test with
+ # `read` on CPython 3.14.0:
+ # - deflate: 2.3 MiB
+ # - deflate2: 2.1 MiB
+ # - gzip: 2.1 MiB
+ # - brotli:
+ # - brotli v1.2.0: 9 MiB
+ # - brotlicffi v1.2.0.0: 6 MiB
+ # - brotlipy v0.7.0: 105.8 MiB
+ @pytest.mark.limit_memory("10 MB", current_thread_only=True)
+ def test_memory_usage_decode_with_max_length(
+ self,
+ request,
+ read_method,
+ data,
+ ):
+ if data is None:
+ pytest.skip(f"Proper {request.node.callspec.id} decoder is not available")
+
+ name, compressed_data = data
+ limit = 1024 * 1024 # 1 MiB
+ if read_method in ("read_chunked", "stream"):
+ httplib_r = httplib.HTTPResponse(MockSock) # type: ignore[arg-type]
+ httplib_r.fp = MockChunkedEncodingResponse([compressed_data]) # type: ignore[assignment]
+ r = HTTPResponse(
+ httplib_r,
+ preload_content=False,
+ headers={"transfer-encoding": "chunked", "content-encoding": name},
+ )
+ next(getattr(r, read_method)(amt=limit, decode_content=True))
+ else:
+ fp = BytesIO(compressed_data)
+ r = HTTPResponse(
+ fp, headers={"content-encoding": name}, preload_content=False
+ )
+ getattr(r, read_method)(amt=limit, decode_content=True)
+
+ # Check that the internal decoded buffer is empty unless brotli
+ # is used.
+ # Google's brotli library does not fully respect the output
+ # buffer limit: https://github.com/google/brotli/issues/1396
+ # And unmaintained brotlipy cannot limit the output buffer size.
+ if name != "br" or brotli.__name__ == "brotlicffi":
+ assert len(r._decoded_buffer) == 0
+
+
def test_multi_decoding_deflate_deflate(self):
data = zlib.compress(zlib.compress(b"foo"))
@@ -494,8 +661,8 @@ class TestResponse(object):
)
stream = resp.stream(2)
- assert next(stream) == b"f"
- assert next(stream) == b"oo"
+ assert next(stream) == b"fo"
+ assert next(stream) == b"o"
with pytest.raises(StopIteration):
next(stream)
@@ -524,6 +691,7 @@ class TestResponse(object):
# Ensure that ``tell()`` returns the correct number of bytes when
# part-way through streaming compressed content.
NUMBER_OF_READS = 10
+ PART_SIZE = 64
class MockCompressedDataReading(BytesIO):
"""
@@ -552,7 +720,7 @@ class TestResponse(object):
resp = HTTPResponse(
fp, headers={"content-encoding": "deflate"}, preload_content=False
)
- stream = resp.stream()
+ stream = resp.stream(PART_SIZE)
parts_positions = [(part, resp.tell()) for part in stream]
end_of_stream = resp.tell()
@@ -567,12 +735,28 @@ class TestResponse(object):
assert uncompressed_data == payload
# Check that the positions in the stream are correct
- expected = [(i + 1) * payload_part_size for i in range(NUMBER_OF_READS)]
- assert expected == list(positions)
+ # It is difficult to determine programmatically what the positions
+ # returned by `tell` will be because the `HTTPResponse.read` method may
+ # call socket `read` a couple of times if it doesn't have enough data
+ # in the buffer, or not call socket `read` at all if it has enough. All
+ # this depends on the message, how it was compressed, and the values of
+ # `PART_SIZE` and `payload_part_size`.
+ # So for simplicity, the expected values are hardcoded.
+ expected = (92, 184, 230, 276, 322, 368, 414, 460)
+ assert expected == positions
# Check that the end of the stream is in the correct place
assert len(ZLIB_PAYLOAD) == end_of_stream
+ # Check that all parts have expected length
+ expected_last_part_size = len(uncompressed_data) % PART_SIZE
+ whole_parts = len(uncompressed_data) // PART_SIZE
+ if expected_last_part_size == 0:
+ expected_lengths = [PART_SIZE] * whole_parts
+ else:
+ expected_lengths = [PART_SIZE] * whole_parts + [expected_last_part_size]
+ assert expected_lengths == [len(part) for part in parts]
+
def test_deflate_streaming(self):
data = zlib.compress(b"foo")
@@ -582,8 +766,8 @@ class TestResponse(object):
)
stream = resp.stream(2)
- assert next(stream) == b"f"
- assert next(stream) == b"oo"
+ assert next(stream) == b"fo"
+ assert next(stream) == b"o"
with pytest.raises(StopIteration):
next(stream)
@@ -598,8 +782,8 @@ class TestResponse(object):
)
stream = resp.stream(2)
- assert next(stream) == b"f"
- assert next(stream) == b"oo"
+ assert next(stream) == b"fo"
+ assert next(stream) == b"o"
with pytest.raises(StopIteration):
next(stream)
Index: urllib3-1.26.20/test/with_dummyserver/test_socketlevel.py
===================================================================
--- urllib3-1.26.20.orig/test/with_dummyserver/test_socketlevel.py
+++ urllib3-1.26.20/test/with_dummyserver/test_socketlevel.py
@@ -1901,15 +1901,8 @@ class TestBadContentLength(SocketDummySe
"GET", url="/", preload_content=False, enforce_content_length=True
)
data = get_response.stream(100)
- # Read "good" data before we try to read again.
- # This won't trigger till generator is exhausted.
- next(data)
- try:
+ with pytest.raises(ProtocolError, match="12 bytes read, 10 more expected"):
next(data)
- assert False
- except ProtocolError as e:
- assert "12 bytes read, 10 more expected" in str(e)
-
done_event.set()
def test_enforce_content_length_no_body(self):
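
For context, a standalone sketch (not part of the patch) of the `max_length`
decoder behaviour the tests above exercise, shown with the stdlib zlib API;
the variable names are illustrative:

    import zlib

    original = b"foo" * 1000
    compressed = zlib.compress(original)

    decoder = zlib.decompressobj()
    # Ask for at most 100 decompressed bytes; unconsumed input stays in
    # decoder.unconsumed_tail until the next call.
    chunk = decoder.decompress(compressed, 100)
    assert len(chunk) == 100
    rest = decoder.decompress(decoder.unconsumed_tail) + decoder.flush()
    assert chunk + rest == original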

CVE-2026-21441.patch
@@ -0,0 +1,64 @@
From 8864ac407bba8607950025e0979c4c69bc7abc7b Mon Sep 17 00:00:00 2001
From: Illia Volochii <illia.volochii@gmail.com>
Date: Wed, 7 Jan 2026 18:07:30 +0200
Subject: [PATCH] Merge commit from fork
* Stop decoding response content during redirects needlessly
* Rename the new query parameter
* Add a changelog entry
---
CHANGES.rst | 13 +++++++++++++
dummyserver/app.py | 8 +++++++-
src/urllib3/response.py | 6 +++++-
test/with_dummyserver/test_connectionpool.py | 19 +++++++++++++++++++
4 files changed, 44 insertions(+), 2 deletions(-)
Index: urllib3-1.26.20/src/urllib3/response.py
===================================================================
--- urllib3-1.26.20.orig/src/urllib3/response.py
+++ urllib3-1.26.20/src/urllib3/response.py
@@ -476,7 +476,11 @@ class HTTPResponse(io.IOBase):
Unread data in the HTTPResponse connection blocks the connection from being released back to the pool.
"""
try:
- self.read()
+ self.read(
+ # Do not spend resources decoding the content unless
+ # decoding has already been initialized.
+ decode_content=self._has_decoded_content,
+ )
except (HTTPError, SocketError, BaseSSLError, HTTPException):
pass
Index: urllib3-1.26.20/test/with_dummyserver/test_connectionpool.py
===================================================================
--- urllib3-1.26.20.orig/test/with_dummyserver/test_connectionpool.py
+++ urllib3-1.26.20/test/with_dummyserver/test_connectionpool.py
@@ -467,6 +467,25 @@ class TestConnectionPool(HTTPDummyServer
assert r.status == 200
assert r.data == b"Dummy server!"
+ @mock.patch("urllib3.response.GzipDecoder.decompress")
+ def test_no_decoding_with_redirect_when_preload_disabled(
+ self, gzip_decompress
+ ):
+ """
+ Test that urllib3 does not attempt to decode a gzipped redirect
+ response when `preload_content` is set to `False`.
+ """
+ with HTTPConnectionPool(self.host, self.port) as pool:
+ # Three requests are expected: two redirects and one final / 200 OK.
+ response = pool.request(
+ "GET",
+ "/redirect",
+ fields={"target": "/redirect?compressed=true", "compressed": "true"},
+ preload_content=False,
+ )
+ assert response.status == 200
+ gzip_decompress.assert_not_called()
+
def test_303_redirect_makes_request_lose_body(self):
with HTTPConnectionPool(self.host, self.port) as pool:
response = pool.request(
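
For context, a self-contained toy model (assumed names, not urllib3's actual
implementation) of the guard this patch adds to the drain step, showing why a
compressed redirect body is no longer decoded unless decoding already started:

    import io

    class MiniResponse:
        def __init__(self, raw: bytes) -> None:
            self._fp = io.BytesIO(raw)
            self._has_decoded_content = False  # flipped once decoding starts

        def read(self, decode_content: bool = True) -> bytes:
            data = self._fp.read()
            if decode_content:
                self._has_decoded_content = True  # a real decoder would run here
            return data

        def drain_conn(self) -> None:
            # Drain unread data so the connection can return to the pool,
            # decoding only if earlier reads already initialized a decoder.
            self.read(decode_content=self._has_decoded_content)

    r = MiniResponse(b"gzipped redirect body")
    r.drain_conn()  # drains without ever touching a decompressor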

filter-pyopenssl-deprecationwarning.patch
@@ -0,0 +1,133 @@
Index: urllib3-1.26.20/test/with_dummyserver/test_https.py
===================================================================
--- urllib3-1.26.20.orig/test/with_dummyserver/test_https.py
+++ urllib3-1.26.20/test/with_dummyserver/test_https.py
@@ -215,6 +215,10 @@ class TestHTTPS(HTTPSDummyServerTestCase
assert conn.__class__ == VerifiedHTTPSConnection
with warnings.catch_warnings(record=True) as w:
+ # Filter PyOpenSSL 25.1+ DeprecationWarning
+ warnings.filterwarnings(
+ "ignore", message="Attempting to mutate a Context after", category=DeprecationWarning
+ )
r = https_pool.request("GET", "/")
assert r.status == 200
@@ -245,6 +249,13 @@ class TestHTTPS(HTTPSDummyServerTestCase
r = https_pool.request("GET", "/")
assert r.status == 200
+ # Filter PyOpenSSL 25.1+ DeprecationWarning
+ calls = warn.call_args_list
+ calls = [
+ call for call in calls if call[0][1] != DeprecationWarning and
+ not call[0][0].startswith("Attempting to mutate a Context")
+ ]
+
# Modern versions of Python, or systems using PyOpenSSL, don't
# emit warnings.
if (
@@ -252,7 +263,7 @@ class TestHTTPS(HTTPSDummyServerTestCase
or util.IS_PYOPENSSL
or util.IS_SECURETRANSPORT
):
- assert not warn.called, warn.call_args_list
+ assert not calls
else:
assert warn.called
if util.HAS_SNI:
@@ -274,6 +285,13 @@ class TestHTTPS(HTTPSDummyServerTestCase
r = https_pool.request("GET", "/")
assert r.status == 200
+ # Filter PyOpenSSL 25.1+ DeprecationWarning
+ calls = warn.call_args_list
+ calls = [
+ call for call in calls if call[0][1] != DeprecationWarning and
+ not call[0][0].startswith("Attempting to mutate a Context")
+ ]
+
# Modern versions of Python, or systems using PyOpenSSL, don't
# emit warnings.
if (
@@ -281,7 +299,7 @@ class TestHTTPS(HTTPSDummyServerTestCase
or util.IS_PYOPENSSL
or util.IS_SECURETRANSPORT
):
- assert not warn.called, warn.call_args_list
+ assert not calls
else:
assert warn.called
if util.HAS_SNI:
@@ -306,6 +324,10 @@ class TestHTTPS(HTTPSDummyServerTestCase
assert conn.__class__ == VerifiedHTTPSConnection
with warnings.catch_warnings(record=True) as w:
+ # Filter PyOpenSSL 25.1+ DeprecationWarning
+ warnings.filterwarnings(
+ "ignore", message="Attempting to mutate a Context after", category=DeprecationWarning
+ )
r = https_pool.request("GET", "/")
assert r.status == 200
@@ -412,6 +434,12 @@ class TestHTTPS(HTTPSDummyServerTestCase
# warnings, which we want to ignore here.
calls = warn.call_args_list
+ # Filter PyOpenSSL 25.1+ DeprecationWarning
+ calls = [
+ call for call in calls if call[0][1] != DeprecationWarning and
+ not call[0][0].startswith("Attempting to mutate a Context")
+ ]
+
# If we're using a deprecated TLS version we can remove 'DeprecationWarning'
if self.tls_protocol_deprecated():
calls = [call for call in calls if call[0][1] != DeprecationWarning]
@@ -687,6 +715,11 @@ class TestHTTPS(HTTPSDummyServerTestCase
def _request_without_resource_warnings(self, method, url):
with warnings.catch_warnings(record=True) as w:
warnings.simplefilter("always")
+ # Filter PyOpenSSL 25.1+ DeprecationWarning
+ warnings.filterwarnings(
+ "ignore", message="Attempting to mutate a Context after",
+ category=DeprecationWarning
+ )
with HTTPSConnectionPool(
self.host, self.port, ca_certs=DEFAULT_CA
) as https_pool:
@@ -742,6 +775,11 @@ class TestHTTPS(HTTPSDummyServerTestCase
conn = https_pool._get_conn()
try:
with warnings.catch_warnings(record=True) as w:
+ # Filter PyOpenSSL 25.1+ DeprecationWarning
+ warnings.filterwarnings(
+ "ignore", message="Attempting to mutate a Context after",
+ category=DeprecationWarning
+ )
conn.connect()
if not hasattr(conn.sock, "version"):
pytest.skip("SSLSocket.version() not available")
@@ -769,6 +807,11 @@ class TestHTTPS(HTTPSDummyServerTestCase
conn = https_pool._get_conn()
try:
with warnings.catch_warnings(record=True) as w:
+ # Filter PyOpenSSL 25.1+ DeprecationWarning
+ warnings.filterwarnings(
+ "ignore", message="Attempting to mutate a Context after",
+ category=DeprecationWarning
+ )
conn.connect()
finally:
conn.close()
@@ -788,6 +831,11 @@ class TestHTTPS(HTTPSDummyServerTestCase
conn = https_pool._get_conn()
try:
with warnings.catch_warnings(record=True) as w:
+ # Filter PyOpenSSL 25.1+ DeprecationWarning
+ warnings.filterwarnings(
+ "ignore", message="Attempting to mutate a Context after",
+ category=DeprecationWarning
+ )
conn.connect()
finally:
conn.close()
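
For context, a minimal standalone sketch of the filtering pattern this patch
applies throughout test_https.py: record warnings, but drop the specific
PyOpenSSL DeprecationWarning by message prefix (the warning texts below are
fabricated for illustration):

    import warnings

    with warnings.catch_warnings(record=True) as w:
        warnings.simplefilter("always")
        warnings.filterwarnings(
            "ignore",
            message="Attempting to mutate a Context after",
            category=DeprecationWarning,
        )
        warnings.warn("Attempting to mutate a Context after use", DeprecationWarning)
        warnings.warn("unrelated warning", UserWarning)

    assert len(w) == 1 and w[0].category is UserWarning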

python-urllib3_1.changes
@@ -1,3 +1,36 @@
-------------------------------------------------------------------
+Wed Jan 21 16:44:35 UTC 2026 - Nico Krapp <nico.krapp@suse.com>
+
+- Add security patches:
+  * CVE-2025-66471.patch (bsc#1254867)
+  * CVE-2025-66418.patch (bsc#1254866)
+  * CVE-2026-21441.patch (bsc#1256331)
+
+-------------------------------------------------------------------
+Tue Aug  5 05:58:09 UTC 2025 - Steve Kowalik <steven.kowalik@suse.com>
+
+- Do not ignore deprecation warnings, the testsuite explicitly
+  clears all warnings multiple times.
+- Add patch filter-pyopenssl-deprecationwarning.patch:
+  * Explicitly filter out new DeprecationWarnings raised by PyOpenSSL 25.1+
+
+-------------------------------------------------------------------
+Thu Jul 17 20:28:07 UTC 2025 - Dirk Müller <dmueller@suse.com>
+
+- ignore deprecation warnings
+
+-------------------------------------------------------------------
+Wed Jun 25 05:18:37 UTC 2025 - Steve Kowalik <steven.kowalik@suse.com>
+
+- Add patch CVE-2025-50181-poolmanager-redirects.patch:
+  * Pool managers now properly control redirects when retries is passed
+    (CVE-2025-50181, GHSA-pq67-6m6q-mj2v, bsc#1244925)
+
+-------------------------------------------------------------------
+Mon May 19 07:29:03 UTC 2025 - Daniel Garcia <daniel.garcia@suse.com>
+
+- Skip some test that fails with latest python-tornado
+
+-------------------------------------------------------------------
Tue Sep 10 06:30:59 UTC 2024 - Steve Kowalik <steven.kowalik@suse.com>
@@ -59,7 +92,7 @@ Mon May 22 11:23:33 UTC 2023 - Steve Kowalik <steven.kowalik@suse.com>
-------------------------------------------------------------------
Mon May 15 13:52:10 UTC 2023 - Dirk Müller <dmueller@suse.com>
-- rename to python-urllib3_1 
+- rename to python-urllib3_1
-------------------------------------------------------------------
Fri Apr 21 12:38:19 UTC 2023 - Dirk Müller <dmueller@suse.com>
@@ -186,7 +219,7 @@ Tue Jul 13 10:53:07 UTC 2021 - Markéta Machová <mmachova@suse.com>
- update to 1.26.6
* Deprecated the urllib3.contrib.ntlmpool module.
-  * Changed HTTPConnection.request_chunked() to not erroneously emit multiple 
+  * Changed HTTPConnection.request_chunked() to not erroneously emit multiple
Transfer-Encoding headers in the case that one is already specified.
* Fixed typo in deprecation message to recommend Retry.DEFAULT_ALLOWED_METHODS.
@@ -268,7 +301,7 @@ Thu Nov 26 09:02:30 UTC 2020 - Dirk Mueller <dmueller@suse.com>
``Retry.DEFAULT_REMOVE_HEADERS_ON_REDIRECT``, and ``Retry(allowed_methods=...)``
(Pull #2000) **Starting in urllib3 v2.0: Deprecated options will be removed**
* Added default ``User-Agent`` header to every request (Pull #1750)
-  * Added ``urllib3.util.SKIP_HEADER`` for skipping ``User-Agent``, ``Accept-Encoding``, 
+  * Added ``urllib3.util.SKIP_HEADER`` for skipping ``User-Agent``, ``Accept-Encoding``,
and ``Host`` headers from being automatically emitted with requests (Pull #2018)
* Collapse ``transfer-encoding: chunked`` request data and framing into
the same ``socket.send()`` call (Pull #1906)
@@ -561,7 +594,7 @@ Sun Jul 15 22:30:26 UTC 2018 - mimi.vx@gmail.com
- add 1414.patch - fix tests with new tornado
- refresh python-urllib3-recent-date.patch
- drop urllib3-test-no-coverage.patch
-  * Allow providing a list of headers to strip from requests when redirecting 
+  * Allow providing a list of headers to strip from requests when redirecting
to a different host. Defaults to the Authorization header. Different
headers can be set via Retry.remove_headers_on_redirect.
* Fix util.selectors._fileobj_to_fd to accept long
@@ -909,9 +942,9 @@ Tue Jan 5 14:40:22 UTC 2016 - hpj@urpla.net
* pyopenssl: Support for TLSv1.1 and TLSv1.2. (Issue #696)
* Close connections more defensively on exception. (Issue #734)
* Adjusted read_chunked to handle gzipped, chunk-encoded bodies
-    without repeatedly flushing the decoder, to function better on 
+    without repeatedly flushing the decoder, to function better on
Jython. (Issue #743)
-  * Accept ca_cert_dir for SSL-related PoolManager configuration. 
+  * Accept ca_cert_dir for SSL-related PoolManager configuration.
(Issue #758)
- removed ready-event.patch: applied upstream
@@ -951,12 +984,12 @@ Wed Oct 14 09:35:30 UTC 2015 - toddrme2178@gmail.com
-------------------------------------------------------------------
Tue Oct 6 15:03:05 UTC 2015 - hpj@urpla.net
-- add python-pyOpenSSL, python-certifi and python-pyasn1 requirements 
+- add python-pyOpenSSL, python-certifi and python-pyasn1 requirements
-------------------------------------------------------------------
Tue Oct 6 12:46:25 UTC 2015 - hpj@urpla.net
-- Comment out test requirements, as tests are disabled anyway, and 
+- Comment out test requirements, as tests are disabled anyway, and
one of these packages depend on python-requests, which depends on
this package resulting in a circular dependency for openSUSE <= 13.1
@@ -966,9 +999,9 @@ Fri Sep 25 11:24:49 UTC 2015 - p.drouand@gmail.com
- Update to version 1.12
* Rely on six for importing httplib to work around conflicts with
other Python 3 shims. (Issue #688)
-  * Add support for directories of certificate authorities, as 
+  * Add support for directories of certificate authorities, as
supported by OpenSSL. (Issue #701)
-  * New exception: NewConnectionError, raised when we fail to 
+  * New exception: NewConnectionError, raised when we fail to
establish a new connection, usually ECONNREFUSED socket error.
- Fix version dependencies
- Add new build requirements following upstream changes
@@ -976,7 +1009,7 @@ Fri Sep 25 11:24:49 UTC 2015 - p.drouand@gmail.com
* python-tox
* python-twine
* python-wheel
-- Update 0001-Don-t-pin-dependency-to-exact-version.patch 
+- Update 0001-Don-t-pin-dependency-to-exact-version.patch
- Disable tests for now, as there require network
-------------------------------------------------------------------
@@ -986,42 +1019,42 @@ Thu Sep 11 12:38:13 UTC 2014 - toddrme2178@gmail.com
- Rebase 0001-Don-t-pin-dependency-to-exact-version.patch and
urllib3-test-no-coverage.patch
- Update to version 1.9 (2014-07-04)
-  * Shuffled around development-related files. 
-    If you're maintaining a distro package of urllib3, you may need 
+  * Shuffled around development-related files.
+    If you're maintaining a distro package of urllib3, you may need
to tweak things. (Issue #415)
-  * Unverified HTTPS requests will trigger a warning on the first 
+  * Unverified HTTPS requests will trigger a warning on the first
request. See our new security documentation for details.
(Issue #426)
-  * New retry logic and urllib3.util.retry.Retry configuration 
+  * New retry logic and urllib3.util.retry.Retry configuration
object. (Issue #326)
-  * All raised exceptions should now wrapped in a 
-    urllib3.exceptions.HTTPException-extending exception. 
+  * All raised exceptions should now wrapped in a
+    urllib3.exceptions.HTTPException-extending exception.
(Issue #326)
* All errors during a retry-enabled request should be wrapped in
-    urllib3.exceptions.MaxRetryError, including timeout-related 
-    exceptions which were previously exempt. Underlying error is 
+    urllib3.exceptions.MaxRetryError, including timeout-related
+    exceptions which were previously exempt. Underlying error is
accessible from the .reason propery. (Issue #326)
-  * urllib3.exceptions.ConnectionError renamed to 
+  * urllib3.exceptions.ConnectionError renamed to
urllib3.exceptions.ProtocolError. (Issue #326)
* Errors during response read (such as IncompleteRead) are now
wrapped in urllib3.exceptions.ProtocolError. (Issue #418)
-  * Requesting an empty host will raise 
+  * Requesting an empty host will raise
urllib3.exceptions.LocationValueError. (Issue #417)
-  * Catch read timeouts over SSL connections as 
+  * Catch read timeouts over SSL connections as
urllib3.exceptions.ReadTimeoutError. (Issue #419)
* Apply socket arguments before connecting. (Issue #427)
- Update to version 1.8.3 (2014-06-23)
-  * Fix TLS verification when using a proxy in Python 3.4.1. 
+  * Fix TLS verification when using a proxy in Python 3.4.1.
(Issue #385)
-  * Add disable_cache option to urllib3.util.make_headers. 
+  * Add disable_cache option to urllib3.util.make_headers.
(Issue #393)
-  * Wrap socket.timeout exception with 
+  * Wrap socket.timeout exception with
urllib3.exceptions.ReadTimeoutError. (Issue #399)
-  * Fixed proxy-related bug where connections were being reused 
+  * Fixed proxy-related bug where connections were being reused
incorrectly. (Issues #366, #369)
-  * Added socket_options keyword parameter which allows to define 
+  * Added socket_options keyword parameter which allows to define
setsockopt configuration of new sockets. (Issue #397)
-  * Removed HTTPConnection.tcp_nodelay in favor of 
+  * Removed HTTPConnection.tcp_nodelay in favor of
HTTPConnection.default_socket_options. (Issue #397)
* Fixed TypeError bug in Python 2.6.4. (Issue #411)
- Update to version 1.8.2 (2014-04-17)
@@ -1029,7 +1062,7 @@ Thu Sep 11 12:38:13 UTC 2014 - toddrme2178@gmail.com
- Update to version 1.8.1 (2014-04-17)
* Fix AppEngine bug of HTTPS requests going out as HTTP.
(Issue #356)
-  * Don't install dummyserver into site-packages as it's only 
+  * Don't install dummyserver into site-packages as it's only
needed for the test suite. (Issue #362)
* Added support for specifying source_address. (Issue #352)

python-urllib3_1.spec
@@ -1,7 +1,7 @@
#
# spec file for package python-urllib3_1
#
-# Copyright (c) 2024 SUSE LLC
+# Copyright (c) 2026 SUSE LLC and contributors
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -35,6 +35,17 @@ Source: https://files.pythonhosted.org/packages/source/u/urllib3/urllib3
# PATCH-FIX-UPSTREAM remove_mock.patch gh#urllib3/urllib3#2108 mcepl@suse.com
# remove dependency on the external module mock
Patch0: remove_mock.patch
+# PATCH-FIX-UPSTREAM CVE-2025-50181 gh#urllib3/urllib3@f05b1329126d, bsc#1244925
+Patch1: CVE-2025-50181-poolmanager-redirects.patch
+# PATCH-FIX-OPENSUSE Explicitly ignore new DeprecationWarning from PyOpenSSL 25.1+
+Patch2: filter-pyopenssl-deprecationwarning.patch
+# PATCH-FIX-UPSTREAM CVE-2025-66471.patch bsc#1254867 gh#urllib3/urllib3@c19571d
+# and parts from gh#urllib3/urllib3@c35033f as prerequisite
+Patch3: CVE-2025-66471.patch
+# PATCH-FIX-UPSTREAM CVE-2025-66418.patch bsc#1254866 gh#urllib3/urllib3@24d7b67
+Patch4: CVE-2025-66418.patch
+# PATCH-FIX-UPSTREAM CVE-2026-21441.patch bsc#1256331 gh#urllib3/urllib3@8864ac4
+Patch5: CVE-2026-21441.patch
BuildRequires: %{python_module base >= 3.7}
BuildRequires: %{python_module pip}
BuildRequires: %{python_module setuptools}
@@ -49,11 +60,11 @@ Requires: python-cryptography >= 1.3.4
Requires: python-idna >= 2.0.0
Requires: python-pyOpenSSL >= 0.14
Requires: python-six >= 1.12.0
-Recommends: python-Brotli >= 1.0.9
+Recommends: python-Brotli >= 1.2.0
Recommends: python-PySocks >= 1.5.6
BuildArch: noarch
%if %{with test}
-BuildRequires: %{python_module Brotli >= 1.0.9}
+BuildRequires: %{python_module Brotli >= 1.2.0}
BuildRequires: %{python_module PySocks >= 1.5.6}
BuildRequires: %{python_module dateutil}
BuildRequires: %{python_module flaky}
@@ -131,6 +142,8 @@ skiplist="test_ssl_read_timeout or test_ssl_failed_fingerprint_verification or t
skiplist+=" or test_recent_date"
# too slow to run in obs (checks 2GiB of data)
skiplist+=" or test_requesting_large_resources_via_ssl"
+# Latest tornado raises an exception on bad header so this test fails
+skiplist+=" or test_skip_header"
# Python 3.12: SSL requests to localhost hang during handshake
python312_skip=" or TestClientCerts or TestSSL or test_cannot_import_ssl or (TestProxyManager and test_connect)"
%pytest -k "not (${skiplist} ${$python_skip})" --no-success-flaky-report