33 Commits

Author SHA256 Message Date
e751161186 Revert to -rc52eb015706124355e431f06f3e97605 2025-08-21 13:41:50 +02:00
f37d3720b0 - Cherry-pick protobuf-fix-google-imports.patch to fix import issues of
reverse-dependency packages within the google namespace (bsc#1244918)

- Cherry-pick protobuf-fix-google-imports.patch to fix import issues of
  reverse-dependency packages within the google namespace (bsc#1244918)

- Cherry-pick protobuf-fix-google-imports.patch to fix import issues of
  reverse-dependency packages within the google namespace (bsc#1244918)

OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=208
2025-08-21 13:35:53 +02:00
17f3e8e786 Accepting request 1286692 from devel:tools:building
OBS-URL: https://build.opensuse.org/request/show/1286692
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/protobuf?expand=0&rev=87
2025-06-23 12:50:38 +00:00
d1376bdf61 - Update to 31.1
  * Support allowing late injection of language feature set
    defaults from FeatureSet extensions while getting feature
    set extension values.
  * Add missing copts attribute (#21982)
  * Python pyi print "import datetime" for Duration/Timestamp
    field
  * Add recursion depth limits to pure python (bsc#1244663, CVE-2025-4565)
  * Fix cmake staleness test
- from version 31.0
  * Loosen py_proto_library check to be on the import path instead
    of full directory (i.e. excluding external/module-name prefix).
  * Add support for import option for protoc.
  * Add notices.h with information about our dependencies' licenses
    and add --notices flag to protoc to print the contents of that file.
  * Move upb minitable code generator into protoc
  * Upgrade abseil-cpp to 20250127 and use @com_google_absl -> @abseil-cpp
    and com_google_googletest -> @googletest canonical BCR names.
  * Remove fast-path check for non-clang compilers in MessageCreator.
  * Add missing include.
  * Add weak attribute to GetClassData to speed up clang builds.

OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=206
2025-06-18 12:43:49 +00:00
e40d2b287a Accepting request 1280464 from devel:tools:building
OBS-URL: https://build.opensuse.org/request/show/1280464
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/protobuf?expand=0&rev=86
2025-06-04 18:27:22 +00:00
c78eea5c12 protobuf 30.2, needed for current abseil-cpp
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=204
2025-05-27 07:07:54 +00:00
a1e0730f31 Accepting request 1274343 from devel:tools:building
OBS-URL: https://build.opensuse.org/request/show/1274343
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/protobuf?expand=0&rev=85
2025-05-06 14:38:50 +00:00
b9ad65081e - update to 29.3
  * Fix cmake installation location of java and go features.
  * Add .bazeliskrc for protobuf repo to tell bazelisk to use 7.1.2 by default. 
  * Update artifact actions to v4
  * Added protobuf-java-util-removescope.patch to avoid Java compilation errors
    due to dependencies marked as runtime.

OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=202
2025-05-05 06:48:59 +00:00
55a603c196 Accepting request 1247793 from devel:tools:building
OBS-URL: https://build.opensuse.org/request/show/1247793
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/protobuf?expand=0&rev=84
2025-02-24 14:49:43 +00:00
07ddc22462 Accepting request 1247730 from home:bmwiedemann:branches:devel:tools:building
add missing references for SLE: (bsc#1230778, CVE-2024-7254)

OBS-URL: https://build.opensuse.org/request/show/1247730
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=200
2025-02-22 07:31:26 +00:00
fcc27c5694 Accepting request 1219411 from devel:tools:building
OBS-URL: https://build.opensuse.org/request/show/1219411
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/protobuf?expand=0&rev=83
2024-10-30 16:33:47 +00:00
babb5cbb9f Accepting request 1219373 from home:fstrba:branches:devel:tools:building
fixup, please forward to factory

OBS-URL: https://build.opensuse.org/request/show/1219373
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=198
2024-10-30 09:40:45 +00:00
66b8197c39 Accepting request 1218833 from home:jengelh:branches:devel:tools:building
- Add versionize-shlibs.patch, delete static-utf8-ranges.patch
  * Build the libutf8_range and libutf8_validity as shared library
    to conform to SLPP

OBS-URL: https://build.opensuse.org/request/show/1218833
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=197
2024-10-29 17:15:02 +00:00
48df2da91b Accepting request 1218800 from home:fstrba:branches:devel:tools:building
fixup

OBS-URL: https://build.opensuse.org/request/show/1218800
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=196
2024-10-28 12:09:00 +00:00
410cfe1f18 Accepting request 1218756 from home:fstrba:branches:devel:tools:building
revert some of the changes that would make upgrades a pain

OBS-URL: https://build.opensuse.org/request/show/1218756
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=195
2024-10-28 10:24:05 +00:00
e2eb062b59 OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=194 2024-10-28 08:55:24 +00:00
e1ec4c72c2 - python: switch to pypi package to get the cythonized component
- drop python-protobuf-setup_py.patch (obsolete)

OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=193
2024-10-28 08:21:47 +00:00
945481ece6 - python: switch to pypi package to get the cythonized component
- drop python-protobuf-setup_py.patch (obsolete)

OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=192
2024-10-28 08:21:33 +00:00
2adc256531 Accepting request 1218671 from home:fstrba:branches:devel:tools:building
Hopefully the last tiny installcheck fix, without changelog now, since it falls under the separate compiler package

OBS-URL: https://build.opensuse.org/request/show/1218671
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=191
2024-10-28 08:08:43 +00:00
99290872d3 - update to 28.3:
  * Fix packed reflection handling bug in edition 2023.
  * Mute the minor version warning
  * Populate Kotlin Manifest Files
  * Re-export includingDefaultValueFields in deprecated state for
    important Cloud customer. (https://github.com/protocolbuffers
    /protobuf/commit/3b62d78dc70d2b43af5998d427452246279363c7)
  * Cherrypick restoration of mutableCopy helpers (https://github
    .com/protocolbuffers/protobuf/commit/3ea568a9b6107ebf0d617c47
    6f53a31490fd3182)
  * Mute the minor version warning

OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=190
2024-10-25 15:24:54 +00:00
97e0db392d - update to 28.3:
  * Fix packed reflection handling bug in edition 2023.
  * Mute the minor version warning
  * Populate Kotlin Manifest Files
  * Re-export includingDefaultValueFields in deprecated state for
    important Cloud customer. (https://github.com/protocolbuffers
    /protobuf/commit/3b62d78dc70d2b43af5998d427452246279363c7)
  * Cherrypick restoration of mutableCopy helpers (https://github
    .com/protocolbuffers/protobuf/commit/3ea568a9b6107ebf0d617c47
    6f53a31490fd3182)
  * Mute the minor version warning

OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=189
2024-10-25 15:24:43 +00:00
4a53e832d4 OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=188 2024-10-25 15:20:10 +00:00
683f61fd35 Accepting request 1218127 from home:fstrba:branches:devel:tools:building
Fix install-check failures by building helper libraries static + split protoc into separate subpackage

OBS-URL: https://build.opensuse.org/request/show/1218127
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=187
2024-10-25 15:06:45 +00:00
bab4b394fd Accepting request 1217274 from home:fstrba:branches:devel:tools:building
little fixes around architecture

OBS-URL: https://build.opensuse.org/request/show/1217274
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=186
2024-10-24 07:55:58 +00:00
1f32bb78b1 Accepting request 1217049 from home:fstrba:branches:devel:tools:building
Sync changes + split java into smaller packages + build the lite runtime for java too

OBS-URL: https://build.opensuse.org/request/show/1217049
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=185
2024-10-23 08:14:56 +00:00
0b7cf4c016 - keep building for 15.4+
  * Ruby C-Extension: Regen stale files

OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=184
2024-10-21 13:27:37 +00:00
9b7bbbdf47 Accepting request 1216706 from home:fstrba:branches:devel:tools:building
Fix build of the python-protobuf on different distributions + package maven artifact metadata for our protoc binary so that the automation of protobuf-maven-plugin finds it + try to simplify upgrades

OBS-URL: https://build.opensuse.org/request/show/1216706
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=183
2024-10-21 13:25:32 +00:00
894747b45b Accepting request 1208150 from home:fstrba:branches:devel:tools:building
Split packages into separate _multibuild specs

OBS-URL: https://build.opensuse.org/request/show/1208150
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=182
2024-10-18 08:41:16 +00:00
8c8573aa7f Accepting request 1206076 from system:homeautomation:home-assistant:unstable
- update to 28.2
  C++: Fix cord handling in DynamicMessage and oneofs
  Java: Add recursion check when parsing unknown fields
- python packages became arch-dependent

OBS-URL: https://build.opensuse.org/request/show/1206076
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=181
2024-10-12 20:47:40 +00:00
21c974f300 Accepting request 1193345 from devel:tools:building
OBS-URL: https://build.opensuse.org/request/show/1193345
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/protobuf?expand=0&rev=82
2024-08-15 07:57:20 +00:00
fb929e86af Accepting request 1193239 from home:AndreasStieger:branches:devel:microos
- tweak and correct how minimum version of abseil is specified
  (20230125 to 20230125.3)
- Remove explicit requirements of the protobuf-devel package, as
  they are autogenerated when needed

OBS-URL: https://build.opensuse.org/request/show/1193239
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=179
2024-08-12 07:03:53 +00:00
a8a8f0bf8a Accepting request 1191540 from devel:tools:building
OBS-URL: https://build.opensuse.org/request/show/1191540
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/protobuf?expand=0&rev=81
2024-08-07 04:09:56 +00:00
78efdebbe9 Accepting request 1191245 from home:AndreasStieger:branches:devel:tools:building
protobuf 25.4

OBS-URL: https://build.opensuse.org/request/show/1191245
OBS-URL: https://build.opensuse.org/package/show/devel:tools:building/protobuf?expand=0&rev=177
2024-08-04 14:12:04 +00:00
21 changed files with 83 additions and 3027 deletions


@@ -1,256 +0,0 @@
From 1e7f83ea1b1945065ce1b89051cd655e4b8de22d Mon Sep 17 00:00:00 2001
From: Protobuf Team Bot <protobuf-github-bot@google.com>
Date: Tue, 13 May 2025 14:42:18 -0700
Subject: [PATCH 2/2] Add recursion depth limits to pure python
PiperOrigin-RevId: 758382549
---
python/google/protobuf/internal/decoder.py | 35 ++++++++++-
.../google/protobuf/internal/decoder_test.py | 14 +++++
.../google/protobuf/internal/message_test.py | 60 +++++++++++++++++--
.../protobuf/internal/self_recursive.proto | 1 +
4 files changed, 105 insertions(+), 5 deletions(-)
diff --git a/python/google/protobuf/internal/decoder.py b/python/google/protobuf/internal/decoder.py
index 89d829142..de0bc19a5 100755
--- a/python/google/protobuf/internal/decoder.py
+++ b/python/google/protobuf/internal/decoder.py
@@ -668,7 +668,13 @@ def GroupDecoder(field_number, is_repeated, is_packed, key, new_default):
if value is None:
value = field_dict.setdefault(key, new_default(message))
# Read sub-message.
+ current_depth += 1
+ if current_depth > _recursion_limit:
+ raise _DecodeError(
+ 'Error parsing message: too many levels of nesting.'
+ )
pos = value.add()._InternalParse(buffer, pos, end, current_depth)
+ current_depth -= 1
# Read end tag.
new_pos = pos+end_tag_len
if buffer[pos:new_pos] != end_tag_bytes or new_pos > end:
@@ -687,7 +693,11 @@ def GroupDecoder(field_number, is_repeated, is_packed, key, new_default):
if value is None:
value = field_dict.setdefault(key, new_default(message))
# Read sub-message.
+ current_depth += 1
+ if current_depth > _recursion_limit:
+ raise _DecodeError('Error parsing message: too many levels of nesting.')
pos = value._InternalParse(buffer, pos, end, current_depth)
+ current_depth -= 1
# Read end tag.
new_pos = pos+end_tag_len
if buffer[pos:new_pos] != end_tag_bytes or new_pos > end:
@@ -720,6 +730,11 @@ def MessageDecoder(field_number, is_repeated, is_packed, key, new_default):
if new_pos > end:
raise _DecodeError('Truncated message.')
# Read sub-message.
+ current_depth += 1
+ if current_depth > _recursion_limit:
+ raise _DecodeError(
+ 'Error parsing message: too many levels of nesting.'
+ )
if (
value.add()._InternalParse(buffer, pos, new_pos, current_depth)
!= new_pos
@@ -727,6 +742,7 @@ def MessageDecoder(field_number, is_repeated, is_packed, key, new_default):
# The only reason _InternalParse would return early is if it
# encountered an end-group tag.
raise _DecodeError('Unexpected end-group tag.')
+ current_depth -= 1
# Predict that the next tag is another copy of the same repeated field.
pos = new_pos + tag_len
if buffer[new_pos:pos] != tag_bytes or new_pos == end:
@@ -746,10 +762,14 @@ def MessageDecoder(field_number, is_repeated, is_packed, key, new_default):
if new_pos > end:
raise _DecodeError('Truncated message.')
# Read sub-message.
+ current_depth += 1
+ if current_depth > _recursion_limit:
+ raise _DecodeError('Error parsing message: too many levels of nesting.')
if value._InternalParse(buffer, pos, new_pos, current_depth) != new_pos:
# The only reason _InternalParse would return early is if it encountered
# an end-group tag.
raise _DecodeError('Unexpected end-group tag.')
+ current_depth -= 1
return new_pos
return DecodeField
@@ -984,6 +1004,15 @@ def _SkipGroup(buffer, pos, end):
pos = new_pos
+DEFAULT_RECURSION_LIMIT = 100
+_recursion_limit = DEFAULT_RECURSION_LIMIT
+
+
+def SetRecursionLimit(new_limit):
+ global _recursion_limit
+ _recursion_limit = new_limit
+
+
def _DecodeUnknownFieldSet(buffer, pos, end_pos=None, current_depth=0):
"""Decode UnknownFieldSet. Returns the UnknownFieldSet and new position."""
@@ -1017,7 +1046,11 @@ def _DecodeUnknownField(
data = buffer[pos:pos+size].tobytes()
pos += size
elif wire_type == wire_format.WIRETYPE_START_GROUP:
- (data, pos) = _DecodeUnknownFieldSet(buffer, pos, None, current_depth)
+ current_depth += 1
+ if current_depth >= _recursion_limit:
+ raise _DecodeError('Error parsing message: too many levels of nesting.')
+ data, pos = _DecodeUnknownFieldSet(buffer, pos, None, current_depth)
+ current_depth -= 1
elif wire_type == wire_format.WIRETYPE_END_GROUP:
return (0, -1)
else:
diff --git a/python/google/protobuf/internal/decoder_test.py b/python/google/protobuf/internal/decoder_test.py
index f801b6e76..11e6465b6 100644
--- a/python/google/protobuf/internal/decoder_test.py
+++ b/python/google/protobuf/internal/decoder_test.py
@@ -11,8 +11,10 @@
import io
import unittest
+from google.protobuf import message
from google.protobuf.internal import decoder
from google.protobuf.internal import testing_refleaks
+from google.protobuf.internal import wire_format
_INPUT_BYTES = b'\x84r\x12'
@@ -52,6 +54,18 @@ class DecoderTest(unittest.TestCase):
size = decoder._DecodeVarint(input_io)
self.assertEqual(size, None)
+ def test_decode_unknown_group_field_too_many_levels(self):
+ data = memoryview(b'\023' * 5_000_000)
+ self.assertRaisesRegex(
+ message.DecodeError,
+ 'Error parsing message',
+ decoder._DecodeUnknownField,
+ data,
+ 1,
+ wire_format.WIRETYPE_START_GROUP,
+ 1
+ )
+
if __name__ == '__main__':
unittest.main()
diff --git a/python/google/protobuf/internal/message_test.py b/python/google/protobuf/internal/message_test.py
index 48e6df806..6facb8135 100755
--- a/python/google/protobuf/internal/message_test.py
+++ b/python/google/protobuf/internal/message_test.py
@@ -36,6 +36,7 @@ from google.protobuf.internal import enum_type_wrapper
from google.protobuf.internal import more_extensions_pb2
from google.protobuf.internal import more_messages_pb2
from google.protobuf.internal import packed_field_test_pb2
+from google.protobuf.internal import self_recursive_pb2
from google.protobuf.internal import test_proto3_optional_pb2
from google.protobuf.internal import test_util
from google.protobuf.internal import testing_refleaks
@@ -1339,6 +1340,52 @@ class MessageTest(unittest.TestCase):
self.assertNotIn('oneof_string', m)
+@testing_refleaks.TestCase
+class TestRecursiveGroup(unittest.TestCase):
+
+ def _MakeRecursiveGroupMessage(self, n):
+ msg = self_recursive_pb2.SelfRecursive()
+ sub = msg
+ for _ in range(n):
+ sub = sub.sub_group
+ sub.i = 1
+ return msg.SerializeToString()
+
+ def testRecursiveGroups(self):
+ recurse_msg = self_recursive_pb2.SelfRecursive()
+ data = self._MakeRecursiveGroupMessage(100)
+ recurse_msg.ParseFromString(data)
+ self.assertTrue(recurse_msg.HasField('sub_group'))
+
+ def testRecursiveGroupsException(self):
+ if api_implementation.Type() != 'python':
+ api_implementation._c_module.SetAllowOversizeProtos(False)
+ recurse_msg = self_recursive_pb2.SelfRecursive()
+ data = self._MakeRecursiveGroupMessage(300)
+ with self.assertRaises(message.DecodeError) as context:
+ recurse_msg.ParseFromString(data)
+ self.assertIn('Error parsing message', str(context.exception))
+ if api_implementation.Type() == 'python':
+ self.assertIn('too many levels of nesting', str(context.exception))
+
+ def testRecursiveGroupsUnknownFields(self):
+ if api_implementation.Type() != 'python':
+ api_implementation._c_module.SetAllowOversizeProtos(False)
+ test_msg = unittest_pb2.TestAllTypes()
+ data = self._MakeRecursiveGroupMessage(300) # unknown to test_msg
+ with self.assertRaises(message.DecodeError) as context:
+ test_msg.ParseFromString(data)
+ self.assertIn(
+ 'Error parsing message',
+ str(context.exception),
+ )
+ if api_implementation.Type() == 'python':
+ self.assertIn('too many levels of nesting', str(context.exception))
+ decoder.SetRecursionLimit(310)
+ test_msg.ParseFromString(data)
+ decoder.SetRecursionLimit(decoder.DEFAULT_RECURSION_LIMIT)
+
+
# Class to test proto2-only features (required, extensions, etc.)
@testing_refleaks.TestCase
class Proto2Test(unittest.TestCase):
@@ -2722,8 +2769,6 @@ class PackedFieldTest(unittest.TestCase):
self.assertEqual(golden_data, message.SerializeToString())
-@unittest.skipIf(api_implementation.Type() == 'python',
- 'explicit tests of the C++ implementation')
@testing_refleaks.TestCase
class OversizeProtosTest(unittest.TestCase):
@@ -2740,16 +2785,23 @@ class OversizeProtosTest(unittest.TestCase):
msg.ParseFromString(self.GenerateNestedProto(100))
def testAssertOversizeProto(self):
- api_implementation._c_module.SetAllowOversizeProtos(False)
+ if api_implementation.Type() != 'python':
+ api_implementation._c_module.SetAllowOversizeProtos(False)
msg = unittest_pb2.TestRecursiveMessage()
with self.assertRaises(message.DecodeError) as context:
msg.ParseFromString(self.GenerateNestedProto(101))
self.assertIn('Error parsing message', str(context.exception))
def testSucceedOversizeProto(self):
- api_implementation._c_module.SetAllowOversizeProtos(True)
+
+ if api_implementation.Type() == 'python':
+ decoder.SetRecursionLimit(310)
+ else:
+ api_implementation._c_module.SetAllowOversizeProtos(True)
+
msg = unittest_pb2.TestRecursiveMessage()
msg.ParseFromString(self.GenerateNestedProto(101))
+ decoder.SetRecursionLimit(decoder.DEFAULT_RECURSION_LIMIT)
if __name__ == '__main__':
diff --git a/python/google/protobuf/internal/self_recursive.proto b/python/google/protobuf/internal/self_recursive.proto
index 20bc2b4d3..d2a7f004b 100644
--- a/python/google/protobuf/internal/self_recursive.proto
+++ b/python/google/protobuf/internal/self_recursive.proto
@@ -12,6 +12,7 @@ package google.protobuf.python.internal;
message SelfRecursive {
SelfRecursive sub = 1;
int32 i = 2;
+ SelfRecursive sub_group = 3 [features.message_encoding = DELIMITED];
}
message IndirectRecursive {
--
2.51.1
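
For context, a minimal usage sketch of the guard this patch introduces, mirroring the tests above. It assumes the pure-Python protobuf backend and a hypothetical generated module my_recursive_pb2 whose SelfRecursive message carries the DELIMITED sub_group field added in self_recursive.proto; SetRecursionLimit and DEFAULT_RECURSION_LIMIT are the names added by the patch itself:

```python
# Sketch only: my_recursive_pb2 is a hypothetical generated module; the
# SetRecursionLimit/DEFAULT_RECURSION_LIMIT names come from the patch above.
from google.protobuf import message
from google.protobuf.internal import decoder
import my_recursive_pb2

def make_nested(depth):
    # Serialize a message nested `depth` group levels deep.
    msg = my_recursive_pb2.SelfRecursive()
    sub = msg
    for _ in range(depth):
        sub = sub.sub_group
    sub.i = 1
    return msg.SerializeToString()

parsed = my_recursive_pb2.SelfRecursive()
try:
    parsed.ParseFromString(make_nested(300))   # exceeds the default limit of 100
except message.DecodeError as err:
    print(err)                                 # 'too many levels of nesting'

decoder.SetRecursionLimit(310)                 # opt in to deeper nesting for trusted input
parsed.ParseFromString(make_nested(300))
decoder.SetRecursionLimit(decoder.DEFAULT_RECURSION_LIMIT)
```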


@@ -1,58 +0,0 @@
From b8ada4c2a07449fe8c4c4574292a501c1350c6e6 Mon Sep 17 00:00:00 2001
From: aviralgarg05 <gargaviral99@gmail.com>
Date: Fri, 9 Jan 2026 20:59:10 +0530
Subject: [PATCH] Fix Any recursion depth bypass in Python
json_format.ParseDict
This fixes a security vulnerability where nested google.protobuf.Any messages
could bypass the max_recursion_depth limit, potentially leading to denial of
service via stack overflow.
The root cause was that _ConvertAnyMessage() was calling itself recursively
via methodcaller() for nested well-known types, bypassing the recursion depth
tracking in ConvertMessage().
The fix routes well-known type parsing through ConvertMessage() to ensure
proper recursion depth accounting for all message types including nested Any.
Fixes #25070
---
python/google/protobuf/json_format.py | 15 +++++++++------
1 file changed, 9 insertions(+), 6 deletions(-)
diff --git a/python/google/protobuf/json_format.py b/python/google/protobuf/json_format.py
index 2a6bba939..9ace6345e 100644
--- a/python/google/protobuf/json_format.py
+++ b/python/google/protobuf/json_format.py
@@ -521,6 +521,10 @@ class _Parser(object):
Raises:
ParseError: In case of convert problems.
"""
+ # Increment recursion depth at message entry. The max_recursion_depth limit
+ # is exclusive: a depth value equal to max_recursion_depth will trigger an
+ # error. For example, with max_recursion_depth=5, nesting up to depth 4 is
+ # allowed, but attempting depth 5 raises ParseError.
self.recursion_depth += 1
if self.recursion_depth > self.max_recursion_depth:
raise ParseError(
@@ -725,12 +729,11 @@ class _Parser(object):
value['value'], sub_message, '{0}.value'.format(path)
)
elif full_name in _WKTJSONMETHODS:
- methodcaller(
- _WKTJSONMETHODS[full_name][1],
- value['value'],
- sub_message,
- '{0}.value'.format(path),
- )(self)
+ # For well-known types (including nested Any), use ConvertMessage
+ # to ensure recursion depth is properly tracked
+ self.ConvertMessage(
+ value['value'], sub_message, '{0}.value'.format(path)
+ )
else:
del value['@type']
self._ConvertFieldValuePair(value, sub_message, path)
--
2.52.0
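
A minimal sketch of the behaviour this patch restores: with the fix in place, deeply nested google.protobuf.Any values passed to json_format.ParseDict count against max_recursion_depth (100 by default in current releases) and raise ParseError instead of recursing without bound. The helper below is illustrative only:

```python
# Sketch only: builds a dict of nested Any values; with the patched runtime,
# ParseDict raises ParseError once the nesting exceeds max_recursion_depth.
from google.protobuf import any_pb2, json_format

def nested_any(depth):
    root = node = {'@type': 'type.googleapis.com/google.protobuf.Any'}
    for _ in range(depth):
        inner = {'@type': 'type.googleapis.com/google.protobuf.Any'}
        node['value'] = inner
        node = inner
    return root

msg = any_pb2.Any()
try:
    json_format.ParseDict(nested_any(500), msg)
except json_format.ParseError as err:
    print(err)   # depth limit hit long before the innermost value is reached
```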


@@ -1,5 +1,4 @@
<multibuild>
<package>python-protobuf</package>
<package>protobuf-java</package>
</multibuild>


@@ -1,4 +1,4 @@
libprotobuf28_3_0
libprotoc28_3_0
libprotobuf-lite28_3_0
libutf8_range-28_3_0
libprotobuf29_3_0
libprotoc29_3_0
libprotobuf-lite29_3_0
libutf8_range-29_3_0


@@ -1,421 +0,0 @@
From dac2e91e36408087d769be89a72fbafe1ea5039c Mon Sep 17 00:00:00 2001
From: Protobuf Team Bot <protobuf-github-bot@google.com>
Date: Tue, 4 Mar 2025 13:16:32 -0800
Subject: [PATCH 1/2] Internal pure python fixes
PiperOrigin-RevId: 733441339
---
python/google/protobuf/internal/decoder.py | 98 ++++++++++++++-----
.../google/protobuf/internal/message_test.py | 1 +
.../protobuf/internal/python_message.py | 7 +-
.../protobuf/internal/self_recursive.proto | 9 +-
4 files changed, 86 insertions(+), 29 deletions(-)
diff --git a/python/google/protobuf/internal/decoder.py b/python/google/protobuf/internal/decoder.py
index dcde1d942..89d829142 100755
--- a/python/google/protobuf/internal/decoder.py
+++ b/python/google/protobuf/internal/decoder.py
@@ -184,7 +184,10 @@ def _SimpleDecoder(wire_type, decode_value):
clear_if_default=False):
if is_packed:
local_DecodeVarint = _DecodeVarint
- def DecodePackedField(buffer, pos, end, message, field_dict):
+ def DecodePackedField(
+ buffer, pos, end, message, field_dict, current_depth=0
+ ):
+ del current_depth # unused
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
@@ -199,11 +202,15 @@ def _SimpleDecoder(wire_type, decode_value):
del value[-1] # Discard corrupt value.
raise _DecodeError('Packed element was truncated.')
return pos
+
return DecodePackedField
elif is_repeated:
tag_bytes = encoder.TagBytes(field_number, wire_type)
tag_len = len(tag_bytes)
- def DecodeRepeatedField(buffer, pos, end, message, field_dict):
+ def DecodeRepeatedField(
+ buffer, pos, end, message, field_dict, current_depth=0
+ ):
+ del current_depth # unused
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
@@ -218,9 +225,12 @@ def _SimpleDecoder(wire_type, decode_value):
if new_pos > end:
raise _DecodeError('Truncated message.')
return new_pos
+
return DecodeRepeatedField
else:
- def DecodeField(buffer, pos, end, message, field_dict):
+
+ def DecodeField(buffer, pos, end, message, field_dict, current_depth=0):
+ del current_depth # unused
(new_value, pos) = decode_value(buffer, pos)
if pos > end:
raise _DecodeError('Truncated message.')
@@ -229,6 +239,7 @@ def _SimpleDecoder(wire_type, decode_value):
else:
field_dict[key] = new_value
return pos
+
return DecodeField
return SpecificDecoder
@@ -364,7 +375,9 @@ def EnumDecoder(field_number, is_repeated, is_packed, key, new_default,
enum_type = key.enum_type
if is_packed:
local_DecodeVarint = _DecodeVarint
- def DecodePackedField(buffer, pos, end, message, field_dict):
+ def DecodePackedField(
+ buffer, pos, end, message, field_dict, current_depth=0
+ ):
"""Decode serialized packed enum to its value and a new position.
Args:
@@ -377,6 +390,7 @@ def EnumDecoder(field_number, is_repeated, is_packed, key, new_default,
Returns:
int, new position in serialized data.
"""
+ del current_depth # unused
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
@@ -407,11 +421,14 @@ def EnumDecoder(field_number, is_repeated, is_packed, key, new_default,
# pylint: enable=protected-access
raise _DecodeError('Packed element was truncated.')
return pos
+
return DecodePackedField
elif is_repeated:
tag_bytes = encoder.TagBytes(field_number, wire_format.WIRETYPE_VARINT)
tag_len = len(tag_bytes)
- def DecodeRepeatedField(buffer, pos, end, message, field_dict):
+ def DecodeRepeatedField(
+ buffer, pos, end, message, field_dict, current_depth=0
+ ):
"""Decode serialized repeated enum to its value and a new position.
Args:
@@ -424,6 +441,7 @@ def EnumDecoder(field_number, is_repeated, is_packed, key, new_default,
Returns:
int, new position in serialized data.
"""
+ del current_depth # unused
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
@@ -446,9 +464,11 @@ def EnumDecoder(field_number, is_repeated, is_packed, key, new_default,
if new_pos > end:
raise _DecodeError('Truncated message.')
return new_pos
+
return DecodeRepeatedField
else:
- def DecodeField(buffer, pos, end, message, field_dict):
+
+ def DecodeField(buffer, pos, end, message, field_dict, current_depth=0):
"""Decode serialized repeated enum to its value and a new position.
Args:
@@ -461,6 +481,7 @@ def EnumDecoder(field_number, is_repeated, is_packed, key, new_default,
Returns:
int, new position in serialized data.
"""
+ del current_depth # unused
value_start_pos = pos
(enum_value, pos) = _DecodeSignedVarint32(buffer, pos)
if pos > end:
@@ -480,6 +501,7 @@ def EnumDecoder(field_number, is_repeated, is_packed, key, new_default,
(tag_bytes, buffer[value_start_pos:pos].tobytes()))
# pylint: enable=protected-access
return pos
+
return DecodeField
@@ -538,7 +560,10 @@ def StringDecoder(field_number, is_repeated, is_packed, key, new_default,
tag_bytes = encoder.TagBytes(field_number,
wire_format.WIRETYPE_LENGTH_DELIMITED)
tag_len = len(tag_bytes)
- def DecodeRepeatedField(buffer, pos, end, message, field_dict):
+ def DecodeRepeatedField(
+ buffer, pos, end, message, field_dict, current_depth=0
+ ):
+ del current_depth # unused
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
@@ -553,9 +578,12 @@ def StringDecoder(field_number, is_repeated, is_packed, key, new_default,
if buffer[new_pos:pos] != tag_bytes or new_pos == end:
# Prediction failed. Return.
return new_pos
+
return DecodeRepeatedField
else:
- def DecodeField(buffer, pos, end, message, field_dict):
+
+ def DecodeField(buffer, pos, end, message, field_dict, current_depth=0):
+ del current_depth # unused
(size, pos) = local_DecodeVarint(buffer, pos)
new_pos = pos + size
if new_pos > end:
@@ -565,6 +593,7 @@ def StringDecoder(field_number, is_repeated, is_packed, key, new_default,
else:
field_dict[key] = _ConvertToUnicode(buffer[pos:new_pos])
return new_pos
+
return DecodeField
@@ -579,7 +608,10 @@ def BytesDecoder(field_number, is_repeated, is_packed, key, new_default,
tag_bytes = encoder.TagBytes(field_number,
wire_format.WIRETYPE_LENGTH_DELIMITED)
tag_len = len(tag_bytes)
- def DecodeRepeatedField(buffer, pos, end, message, field_dict):
+ def DecodeRepeatedField(
+ buffer, pos, end, message, field_dict, current_depth=0
+ ):
+ del current_depth # unused
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
@@ -594,9 +626,12 @@ def BytesDecoder(field_number, is_repeated, is_packed, key, new_default,
if buffer[new_pos:pos] != tag_bytes or new_pos == end:
# Prediction failed. Return.
return new_pos
+
return DecodeRepeatedField
else:
- def DecodeField(buffer, pos, end, message, field_dict):
+
+ def DecodeField(buffer, pos, end, message, field_dict, current_depth=0):
+ del current_depth # unused
(size, pos) = local_DecodeVarint(buffer, pos)
new_pos = pos + size
if new_pos > end:
@@ -606,6 +641,7 @@ def BytesDecoder(field_number, is_repeated, is_packed, key, new_default,
else:
field_dict[key] = buffer[pos:new_pos].tobytes()
return new_pos
+
return DecodeField
@@ -621,7 +657,9 @@ def GroupDecoder(field_number, is_repeated, is_packed, key, new_default):
tag_bytes = encoder.TagBytes(field_number,
wire_format.WIRETYPE_START_GROUP)
tag_len = len(tag_bytes)
- def DecodeRepeatedField(buffer, pos, end, message, field_dict):
+ def DecodeRepeatedField(
+ buffer, pos, end, message, field_dict, current_depth=0
+ ):
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
@@ -630,7 +668,7 @@ def GroupDecoder(field_number, is_repeated, is_packed, key, new_default):
if value is None:
value = field_dict.setdefault(key, new_default(message))
# Read sub-message.
- pos = value.add()._InternalParse(buffer, pos, end)
+ pos = value.add()._InternalParse(buffer, pos, end, current_depth)
# Read end tag.
new_pos = pos+end_tag_len
if buffer[pos:new_pos] != end_tag_bytes or new_pos > end:
@@ -640,19 +678,22 @@ def GroupDecoder(field_number, is_repeated, is_packed, key, new_default):
if buffer[new_pos:pos] != tag_bytes or new_pos == end:
# Prediction failed. Return.
return new_pos
+
return DecodeRepeatedField
else:
- def DecodeField(buffer, pos, end, message, field_dict):
+
+ def DecodeField(buffer, pos, end, message, field_dict, current_depth=0):
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
# Read sub-message.
- pos = value._InternalParse(buffer, pos, end)
+ pos = value._InternalParse(buffer, pos, end, current_depth)
# Read end tag.
new_pos = pos+end_tag_len
if buffer[pos:new_pos] != end_tag_bytes or new_pos > end:
raise _DecodeError('Missing group end tag.')
return new_pos
+
return DecodeField
@@ -666,7 +707,9 @@ def MessageDecoder(field_number, is_repeated, is_packed, key, new_default):
tag_bytes = encoder.TagBytes(field_number,
wire_format.WIRETYPE_LENGTH_DELIMITED)
tag_len = len(tag_bytes)
- def DecodeRepeatedField(buffer, pos, end, message, field_dict):
+ def DecodeRepeatedField(
+ buffer, pos, end, message, field_dict, current_depth=0
+ ):
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
@@ -677,7 +720,10 @@ def MessageDecoder(field_number, is_repeated, is_packed, key, new_default):
if new_pos > end:
raise _DecodeError('Truncated message.')
# Read sub-message.
- if value.add()._InternalParse(buffer, pos, new_pos) != new_pos:
+ if (
+ value.add()._InternalParse(buffer, pos, new_pos, current_depth)
+ != new_pos
+ ):
# The only reason _InternalParse would return early is if it
# encountered an end-group tag.
raise _DecodeError('Unexpected end-group tag.')
@@ -686,9 +732,11 @@ def MessageDecoder(field_number, is_repeated, is_packed, key, new_default):
if buffer[new_pos:pos] != tag_bytes or new_pos == end:
# Prediction failed. Return.
return new_pos
+
return DecodeRepeatedField
else:
- def DecodeField(buffer, pos, end, message, field_dict):
+
+ def DecodeField(buffer, pos, end, message, field_dict, current_depth=0):
value = field_dict.get(key)
if value is None:
value = field_dict.setdefault(key, new_default(message))
@@ -698,11 +746,12 @@ def MessageDecoder(field_number, is_repeated, is_packed, key, new_default):
if new_pos > end:
raise _DecodeError('Truncated message.')
# Read sub-message.
- if value._InternalParse(buffer, pos, new_pos) != new_pos:
+ if value._InternalParse(buffer, pos, new_pos, current_depth) != new_pos:
# The only reason _InternalParse would return early is if it encountered
# an end-group tag.
raise _DecodeError('Unexpected end-group tag.')
return new_pos
+
return DecodeField
@@ -851,7 +900,8 @@ def MapDecoder(field_descriptor, new_default, is_message_map):
# Can't read _concrete_class yet; might not be initialized.
message_type = field_descriptor.message_type
- def DecodeMap(buffer, pos, end, message, field_dict):
+ def DecodeMap(buffer, pos, end, message, field_dict, current_depth=0):
+ del current_depth # Unused.
submsg = message_type._concrete_class()
value = field_dict.get(key)
if value is None:
@@ -934,7 +984,7 @@ def _SkipGroup(buffer, pos, end):
pos = new_pos
-def _DecodeUnknownFieldSet(buffer, pos, end_pos=None):
+def _DecodeUnknownFieldSet(buffer, pos, end_pos=None, current_depth=0):
"""Decode UnknownFieldSet. Returns the UnknownFieldSet and new position."""
unknown_field_set = containers.UnknownFieldSet()
@@ -944,14 +994,16 @@ def _DecodeUnknownFieldSet(buffer, pos, end_pos=None):
field_number, wire_type = wire_format.UnpackTag(tag)
if wire_type == wire_format.WIRETYPE_END_GROUP:
break
- (data, pos) = _DecodeUnknownField(buffer, pos, wire_type)
+ (data, pos) = _DecodeUnknownField(buffer, pos, wire_type, current_depth)
# pylint: disable=protected-access
unknown_field_set._add(field_number, wire_type, data)
return (unknown_field_set, pos)
-def _DecodeUnknownField(buffer, pos, wire_type):
+def _DecodeUnknownField(
+ buffer, pos, wire_type, current_depth=0
+):
"""Decode a unknown field. Returns the UnknownField and new position."""
if wire_type == wire_format.WIRETYPE_VARINT:
@@ -965,7 +1017,7 @@ def _DecodeUnknownField(buffer, pos, wire_type):
data = buffer[pos:pos+size].tobytes()
pos += size
elif wire_type == wire_format.WIRETYPE_START_GROUP:
- (data, pos) = _DecodeUnknownFieldSet(buffer, pos)
+ (data, pos) = _DecodeUnknownFieldSet(buffer, pos, None, current_depth)
elif wire_type == wire_format.WIRETYPE_END_GROUP:
return (0, -1)
else:
diff --git a/python/google/protobuf/internal/message_test.py b/python/google/protobuf/internal/message_test.py
index 2a723eabb..48e6df806 100755
--- a/python/google/protobuf/internal/message_test.py
+++ b/python/google/protobuf/internal/message_test.py
@@ -30,6 +30,7 @@ import warnings
cmp = lambda x, y: (x > y) - (x < y)
from google.protobuf.internal import api_implementation # pylint: disable=g-import-not-at-top
+from google.protobuf.internal import decoder
from google.protobuf.internal import encoder
from google.protobuf.internal import enum_type_wrapper
from google.protobuf.internal import more_extensions_pb2
diff --git a/python/google/protobuf/internal/python_message.py b/python/google/protobuf/internal/python_message.py
index fabc6aa07..62c059cd2 100755
--- a/python/google/protobuf/internal/python_message.py
+++ b/python/google/protobuf/internal/python_message.py
@@ -1194,7 +1194,7 @@ def _AddMergeFromStringMethod(message_descriptor, cls):
fields_by_tag = cls._fields_by_tag
message_set_decoders_by_tag = cls._message_set_decoders_by_tag
- def InternalParse(self, buffer, pos, end):
+ def InternalParse(self, buffer, pos, end, current_depth=0):
"""Create a message from serialized bytes.
Args:
@@ -1244,10 +1244,13 @@ def _AddMergeFromStringMethod(message_descriptor, cls):
else:
_MaybeAddDecoder(cls, field_des)
field_decoder = field_des._decoders[is_packed]
- pos = field_decoder(buffer, new_pos, end, self, field_dict)
+ pos = field_decoder(
+ buffer, new_pos, end, self, field_dict, current_depth
+ )
if field_des.containing_oneof:
self._UpdateOneofState(field_des)
return pos
+
cls._InternalParse = InternalParse
diff --git a/python/google/protobuf/internal/self_recursive.proto b/python/google/protobuf/internal/self_recursive.proto
index dbfcaf971..20bc2b4d3 100644
--- a/python/google/protobuf/internal/self_recursive.proto
+++ b/python/google/protobuf/internal/self_recursive.proto
@@ -5,18 +5,19 @@
// license that can be found in the LICENSE file or at
// https://developers.google.com/open-source/licenses/bsd
-syntax = "proto2";
+edition = "2023";
package google.protobuf.python.internal;
message SelfRecursive {
- optional SelfRecursive sub = 1;
+ SelfRecursive sub = 1;
+ int32 i = 2;
}
message IndirectRecursive {
- optional IntermediateRecursive intermediate = 1;
+ IntermediateRecursive intermediate = 1;
}
message IntermediateRecursive {
- optional IndirectRecursive indirect = 1;
+ IndirectRecursive indirect = 1;
}
--
2.51.1

Binary files changed (content not shown), including the new LFS tarballs protobuf-29.3.tar.gz and protobuf-5.29.3.tar.gz.


@@ -1,45 +0,0 @@
From 8351926380c7cc91aae6df5695c91426e209f958 Mon Sep 17 00:00:00 2001
From: Ge Yunxi <141423244+gyx47@users.noreply.github.com>
Date: Fri, 11 Jul 2025 11:04:58 -0700
Subject: [PATCH] drop-deprecated-pkg-resources-declare (#22442)
# Description
As of setuptools 81, pkg_resources.declare_namespace has been marked as deprecated (scheduled to be removed after 2025-11-30) so I remove it from init.py
# Environment:
a virtual machine of arch riscv64
# procedure
I got this problem when running a test that applied this package.
```
src/certbot_dns_google/_internal/tests/dns_google_test.py:9: in <module>
from google.auth import exceptions as googleauth_exceptions
/usr/lib/python3.13/site-packages/google/__init__.py:2: in <module>
__import__('pkg_resources').declare_namespace(__name__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/site-packages/pkg_resources/__init__.py:98: in <module>
warnings.warn(
E UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
```
[certbot-dns-google-4.1.1-1-riscv64-check.log](https://github.com/user-attachments/files/20976539/certbot-dns-google-4.1.1-1-riscv64-check.log)
Closes #22442
COPYBARA_INTEGRATE_REVIEW=https://github.com/protocolbuffers/protobuf/pull/22442 from gyx47:patch-1 6aef5c9df150cce444910d224fe90b2a514c7868
PiperOrigin-RevId: 782041935
---
python/google/__init__.py | 7 +++----
1 file changed, 3 insertions(+), 4 deletions(-)
diff --git a/python/google/__init__.py b/python/google/__init__.py
index 5585614122997..b36383a61027f 100644
--- a/python/google/__init__.py
+++ b/python/google/__init__.py
@@ -1,4 +1,3 @@
-try:
- __import__('pkg_resources').declare_namespace(__name__)
-except ImportError:
- __path__ = __import__('pkgutil').extend_path(__path__, __name__)
+from pkgutil import extend_path
+
+__path__ = extend_path(__path__, __name__)


@@ -4,7 +4,7 @@
<parent>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-parent</artifactId>
<version>4.28.3</version>
<version>4.29.3</version>
</parent>
<artifactId>protobuf-java</artifactId>


@@ -4,7 +4,7 @@
<parent>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-parent</artifactId>
<version>4.28.3</version>
<version>4.29.3</version>
</parent>
<artifactId>protobuf-java-util</artifactId>
@@ -16,32 +16,37 @@
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
<version>4.28.3</version>
<version>4.29.3</version>
</dependency>
<dependency>
<groupId>com.google.code.findbugs</groupId>
<artifactId>jsr305</artifactId>
<version>3.0.2</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.9</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.google.errorprone</groupId>
<artifactId>error_prone_annotations</artifactId>
<version>2.18.0</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>32.0.1-jre</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.google.j2objc</groupId>
<artifactId>j2objc-annotations</artifactId>
<version>2.8</version>
<scope>runtime</scope>
</dependency>
</dependencies>


@@ -0,0 +1,34 @@
--- pom.xml 2025-05-02 23:04:51.224332863 +0200
+++ pom.xml 2025-05-02 23:05:30.728959217 +0200
@@ -22,31 +22,26 @@
<groupId>com.google.code.findbugs</groupId>
<artifactId>jsr305</artifactId>
<version>3.0.2</version>
- <scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.9</version>
- <scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.google.errorprone</groupId>
<artifactId>error_prone_annotations</artifactId>
<version>2.18.0</version>
- <scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>32.0.1-jre</version>
- <scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.google.j2objc</groupId>
<artifactId>j2objc-annotations</artifactId>
<version>2.8</version>
- <scope>runtime</scope>
</dependency>
</dependencies>


@@ -1,27 +1,12 @@
-------------------------------------------------------------------
Tue Jan 27 08:29:42 UTC 2026 - John Paul Adrian Glaubitz <adrian.glaubitz@suse.com>
Thu May 1 09:05:08 UTC 2025 - Antonello Tartamo <antonello.tartamo@suse.com>
- Delete deprecated google/__init__.py namespace file
-------------------------------------------------------------------
Mon Jan 26 13:00:51 UTC 2026 - John Paul Adrian Glaubitz <adrian.glaubitz@suse.com>
- Add CVE-2026-0994.patch to fix google.protobuf.Any recursion depth
bypass in Python json_format.ParseDict (bsc#1257173, CVE-2026-0994)
-------------------------------------------------------------------
Mon Jan 5 12:14:24 UTC 2026 - John Paul Adrian Glaubitz <adrian.glaubitz@suse.com>
- Cherry-pick protobuf-fix-google-imports.patch to fix import issues of
reverse-dependency packages within the google namespace (bsc#1244918)
-------------------------------------------------------------------
Fri Nov 14 14:32:06 UTC 2025 - John Paul Adrian Glaubitz <adrian.glaubitz@suse.com>
- Add internal-pure-python-fixes.patch to backport changes required for CVE fix
- Add CVE-2025-4565.patch to fix parsing of untrusted Protocol Buffers
data containing an arbitrary number of recursive groups or messages
can lead to crash due to RecursionError (bsc#1244663, CVE-2025-4565)
- update to 29.3
* Fix cmake installation location of java and go features.
* Add .bazeliskrc for protobuf repo to tell bazelisk to use 7.1.2 by default.
* Update artifact actions to v4
* Added protobuf-java-util-removescope.patch to avoid Java compilation errors
due to dependencies marked as runtime.
-------------------------------------------------------------------
Mon Oct 28 08:20:17 UTC 2024 - Dirk Müller <dmueller@suse.com>


@@ -1,7 +1,7 @@
#
# spec file for package protobuf-java
#
# Copyright (c) 2026 SUSE LLC and contributors
# Copyright (c) 2025 SUSE LLC
# Copyright (c) 2024 Andreas Stieger <Andreas.Stieger@gmx.de>
#
# All modifications and additions to the file contributed by third parties
@@ -18,8 +18,9 @@
%define tarname protobuf
%define patchjuname protobuf-java-util-removescope.patch
Name: protobuf-java
Version: 28.3
Version: 29.3
Release: 0
Summary: Java Bindings for Google Protocol Buffers
License: BSD-3-Clause
@@ -29,6 +30,7 @@ Source0: https://github.com/protocolbuffers/protobuf/releases/download/v%
Source1: https://repo1.maven.org/maven2/com/google/protobuf/%{name}/4.%{version}/%{name}-4.%{version}.pom
Source2: https://repo1.maven.org/maven2/com/google/protobuf/%{name}lite/4.%{version}/%{name}lite-4.%{version}.pom
Source3: https://repo1.maven.org/maven2/com/google/protobuf/%{name}-util/4.%{version}/%{name}-util-4.%{version}.pom
Source4: %{patchjuname}
BuildRequires: fdupes
BuildRequires: java-devel >= 1.8
BuildRequires: maven-local
@@ -86,6 +88,10 @@ pushd java
cp %{SOURCE1} core/pom.xml
cp %{SOURCE2} lite/pom.xml
cp %{SOURCE3} util/pom.xml
cp %{SOURCE4} util/%{patchjuname}
pushd util
patch -p0 < %{patchjuname}
popd
%pom_disable_module kotlin
%pom_disable_module kotlin-lite
%pom_remove_plugin :animal-sniffer-maven-plugin


@@ -4,7 +4,7 @@
<parent>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-parent</artifactId>
<version>4.28.3</version>
<version>4.29.3</version>
</parent>
<artifactId>protobuf-javalite</artifactId>


@@ -1,27 +1,12 @@
-------------------------------------------------------------------
Tue Jan 27 08:29:42 UTC 2026 - John Paul Adrian Glaubitz <adrian.glaubitz@suse.com>
Thu May 1 09:05:08 UTC 2025 - Antonello Tartamo <antonello.tartamo@suse.com>
- Delete deprecated google/__init__.py namespace file
-------------------------------------------------------------------
Mon Jan 26 13:00:51 UTC 2026 - John Paul Adrian Glaubitz <adrian.glaubitz@suse.com>
- Add CVE-2026-0994.patch to fix google.protobuf.Any recursion depth
bypass in Python json_format.ParseDict (bsc#1257173, CVE-2026-0994)
-------------------------------------------------------------------
Mon Jan 5 12:14:24 UTC 2026 - John Paul Adrian Glaubitz <adrian.glaubitz@suse.com>
- Cherry-pick protobuf-fix-google-imports.patch to fix import issues of
reverse-dependency packages within the google namespace (bsc#1244918)
-------------------------------------------------------------------
Fri Nov 14 14:32:06 UTC 2025 - John Paul Adrian Glaubitz <adrian.glaubitz@suse.com>
- Add internal-pure-python-fixes.patch to backport changes required for CVE fix
- Add CVE-2025-4565.patch to fix parsing of untrusted Protocol Buffers
data containing an arbitrary number of recursive groups or messages
can lead to crash due to RecursionError (bsc#1244663, CVE-2025-4565)
- update to 29.3
* Fix cmake installation location of java and go features.
* Add .bazeliskrc for protobuf repo to tell bazelisk to use 7.1.2 by default.
* Update artifact actions to v4
* Added protobuf-java-util-removescope.patch to avoid Java compilation errors
due to dependencies marked as runtime.
-------------------------------------------------------------------
Mon Oct 28 08:20:17 UTC 2024 - Dirk Müller <dmueller@suse.com>
@@ -50,6 +35,7 @@ Fri Oct 25 15:24:11 UTC 2024 - Dirk Müller <dmueller@suse.com>
.com/protocolbuffers/protobuf/commit/3ea568a9b6107ebf0d617c47
6f53a31490fd3182)
* Mute the minor version warning
* fixed (bsc#1230778, CVE-2024-7254)
-------------------------------------------------------------------
Thu Oct 24 20:56:51 UTC 2024 - Fridrich Strba <fstrba@suse.com>


@@ -1,7 +1,7 @@
#
# spec file for package protobuf
#
# Copyright (c) 2026 SUSE LLC and contributors
# Copyright (c) 2025 SUSE LLC
# Copyright (c) 2024 Andreas Stieger <Andreas.Stieger@gmx.de>
#
# All modifications and additions to the file contributed by third parties
@@ -20,7 +20,7 @@
%define tarname protobuf
# see cmake/abseil-cpp.cmake and src/google/protobuf/port_def.inc
%define abseil_min_version 20230125.3
%global sover 28_3_0
%global sover 29_3_0
%if 0%{?gcc_version} < 11
%define with_gcc 11
%endif
@@ -66,7 +66,7 @@
%global protoc_arch sparc_64
%endif
Name: protobuf
Version: 28.3
Version: 29.3
Release: 0
Summary: Protocol Buffers - Google's data interchange format
License: BSD-3-Clause
@@ -75,11 +75,6 @@ URL: https://github.com/protocolbuffers/protobuf
Source0: https://github.com/protocolbuffers/protobuf/releases/download/v%{version}/%{tarname}-%{version}.tar.gz
Source1: baselibs.conf
Patch1: versionize-shlibs.patch
# PATCH-FIX-UPSTREAM - Backport changes from 29.x branch required to apply fix for CVE-2025-4565
Patch2: internal-pure-python-fixes.patch
# PATCH-FIX-UPSTREAM - Fix parsing of untrusted Protocol Buffers data containing an arbitrary
# number of recursive groups or messages can lead to crash due to RecursionError (CVE-2025-4565)
Patch3: CVE-2025-4565.patch
BuildRequires: cmake
BuildRequires: fdupes
BuildRequires: gcc%{?with_gcc}-c++
@@ -211,6 +206,7 @@ install -Dm 0644 editors/proto.vim %{buildroot}%{_datadir}/vim/site/syntax/proto
# manual ln that we could not manage to get into versionize-shlibs.patch
ln -s libutf8_range-%{version}.0.so %{buildroot}/%{_libdir}/libutf8_range.so
ln -s libutf8_validity-%{version}.0.so %{buildroot}/%{_libdir}/libutf8_validity.so
install -D java/core/src/main/resources/google/protobuf/java_features.proto %{buildroot}%{_includedir}/java/core/src/main/resources/google/protobuf/java_features.proto
# create maven metadata for the protoc executable
install -dm 0755 %{buildroot}%{_datadir}/maven-metadata

File diff suppressed because it is too large.


@@ -1,75 +0,0 @@
#
# spec file for package python-protobuf
#
# Copyright (c) 2026 SUSE LLC and contributors
# Copyright (c) 2024 Andreas Stieger <Andreas.Stieger@gmx.de>
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
# upon. The license for this file, and modifications and additions to the
# file, is the same license as for the pristine package itself (unless the
# license for the pristine package is not an Open Source License, in which
# case the license is the MIT License). An "Open Source License" is a
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.
# Please submit bugfixes or comments via https://bugs.opensuse.org/
#
%define baseversion 28.3
%{?sle15_python_module_pythons}
Name: python-protobuf
Version: 5.%{baseversion}
Release: 0
Summary: Python Bindings for Google Protocol Buffers
License: BSD-3-Clause
Group: Development/Libraries/Python
URL: https://github.com/protocolbuffers/protobuf
Source0: https://files.pythonhosted.org/packages/source/p/protobuf/protobuf-%{version}.tar.gz
Patch0: https://github.com/protocolbuffers/protobuf/commit/8351926380c7cc91aae6df5695c91426e209f958.patch#/protobuf-fix-google-imports.patch
# PATCH-FIX-UPSTREAM - Fix google.protobuf.Any recursion depth bypass in Python json_format.ParseDict (CVE-2026-0994)
Patch1: CVE-2026-0994.patch
BuildRequires: %{python_module devel}
BuildRequires: %{python_module pip}
BuildRequires: %{python_module python-dateutil}
BuildRequires: %{python_module setuptools}
BuildRequires: %{python_module wheel}
BuildRequires: fdupes
%python_subpackages
%description
Protocol Buffers are a way of encoding structured data in an efficient yet
extensible format. Google uses Protocol Buffers for almost all of its internal
RPC protocols and file formats.
This package contains the Python bindings for Google Protocol Buffers.
%prep
%autosetup -p2 -n protobuf-%{version}
rm -f google/__init__.py
# The previous blank line is crucial for older system being able
# to use the autosetup macro
grep -qF "'%{version}'" google/protobuf/__init__.py
# kill shebang that we do not really want
sed -i -e '/env python/d' google/protobuf/internal/*.py
%build
%pyproject_wheel
%install
%pyproject_install
%python_expand %fdupes %{buildroot}%{$python_sitearch}
%fdupes %{buildroot}%{_prefix}
%files %{python_files}
%license LICENSE
%{python_sitearch}/google
%{python_sitearch}/protobuf*nspkg.pth
%{python_sitearch}/protobuf-%{version}.dist-info
%changelog


@@ -16,10 +16,10 @@ slightly different from PR19009 while the PR is unmerged.
third_party/utf8_range/CMakeLists.txt | 8 ++++++++
1 file changed, 8 insertions(+)
Index: protobuf-28.3/third_party/utf8_range/CMakeLists.txt
Index: protobuf-29.3/third_party/utf8_range/CMakeLists.txt
===================================================================
--- protobuf-28.3.orig/third_party/utf8_range/CMakeLists.txt
+++ protobuf-28.3/third_party/utf8_range/CMakeLists.txt
--- protobuf-29.3.orig/third_party/utf8_range/CMakeLists.txt
+++ protobuf-29.3/third_party/utf8_range/CMakeLists.txt
@@ -19,6 +19,9 @@ add_library (utf8_range
# A heavier-weight C++ wrapper that supports Abseil.
add_library (utf8_validity utf8_validity.cc utf8_range.c)