Accepting request 1231910 from home:uncomfyhalomacro:branches:science:machinelearning
- Update to version 0.21.0:
  * More cache options.
  * Disable caching for long strings.
  * Testing ABI3 wheels to reduce number of wheels
  * Adding an API for decode streaming.
  * Decode stream python
  * Fix encode_batch and encode_batch_fast to accept ndarrays again

OBS-URL: https://build.opensuse.org/request/show/1231910
OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/python-tokenizers?expand=0&rev=8
Commit 82d122dfa5 (parent 5e74175ce9), committed by Git OBS Bridge.
 _service | 2 +-
--- a/_service
+++ b/_service
@@ -2,7 +2,7 @@
 <services>
   <service mode="manual" name="download_files" />
   <service name="cargo_vendor" mode="manual">
-    <param name="src">tokenizers*.tar.gz</param>
+    <param name="src">tokenizers-*.tar.gz</param>
     <param name="method">registry</param>
     <param name="update">true</param>
     <param name="compression">zst</param>
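Why the `_service` glob was tightened: a short stdlib `fnmatch` sketch showing which tarball names each pattern accepts. `tokenizers_extra-1.0.tar.gz` is a hypothetical sibling artifact used only for illustration, not a file from this package.

```python
from fnmatch import fnmatch

names = ["tokenizers-0.21.0.tar.gz", "tokenizers_extra-1.0.tar.gz"]

# Old pattern: anything starting with "tokenizers" matches, including the stray file.
old_matches = [n for n in names if fnmatch(n, "tokenizers*.tar.gz")]

# New pattern: the hyphen pins the match to the package's own name-version tarballs.
new_matches = [n for n in names if fnmatch(n, "tokenizers-*.tar.gz")]
```

With the old glob, `old_matches` contains both names; the new glob keeps only `tokenizers-0.21.0.tar.gz`.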
@@ -1,3 +1,14 @@
+-------------------------------------------------------------------
+Wed Dec 18 14:20:07 UTC 2024 - Soc Virnyl Estela <uncomfyhalomacro@opensuse.org>
+
+- Update to version 0.21.0:
+  * More cache options.
+  * Disable caching for long strings.
+  * Testing ABI3 wheels to reduce number of wheels
+  * Adding an API for decode streaming.
+  * Decode stream python
+  * Fix encode_batch and encode_batch_fast to accept ndarrays again
+
 -------------------------------------------------------------------
 Thu Nov 7 11:30:50 UTC 2024 - Soc Virnyl Estela <uncomfyhalomacro@opensuse.org>
 
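The new changelog entry follows the usual openSUSE `.changes` header layout (separator line, then `Day Mon D HH:MM:SS UTC YYYY - author <email>`). A hedged sketch of producing that header; the 67-dash separator width is an assumption based on typical `.changes` files, not taken from this diff.

```python
from datetime import datetime

def changes_header(when: datetime, author: str) -> str:
    """Format an openSUSE-style .changes entry header (sketch)."""
    # .changes headers use an unpadded day of month, so assemble it manually
    # instead of relying on platform-specific strftime flags.
    stamp = f"{when:%a %b} {when.day} {when:%H:%M:%S} UTC {when.year}"
    return "-" * 67 + "\n" + f"{stamp} - {author}"

header = changes_header(
    datetime(2024, 12, 18, 14, 20, 7),
    "Soc Virnyl Estela <uncomfyhalomacro@opensuse.org>",
)
```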
@@ -22,7 +22,7 @@
 
 %{?sle15_python_module_pythons}
 Name:           python-tokenizers
-Version:        0.20.3
+Version:        0.21.0
 Release:        0
 Summary:        Provides an implementation of today's most used tokenizers
 License:        Apache-2.0
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:24cd5d9d9fa151e42867be3979aaa00ab3ea4ac1fa1bb7650c7b0803986d7cad
-size 46442951
+oid sha256:c6e2a7b1f52a7523f92f36544bd7088ee7a96622f493a7c1a4f353c25078bb74
+size 46535229
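The blobs tracked here are Git LFS pointer files, not the tarballs themselves: three `key value` lines naming the spec version, a `sha256:` object id, and the real file size in bytes. A minimal parsing sketch, using the new vendor-tarball pointer from the hunk above:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse the 'key value' lines of a Git LFS pointer file (sketch)."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],
        "algo": algo,        # hash algorithm, e.g. "sha256"
        "digest": digest,    # hex digest of the real file's content
        "size": int(fields["size"]),  # size of the real file in bytes
    }

pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:c6e2a7b1f52a7523f92f36544bd7088ee7a96622f493a7c1a4f353c25078bb74\n"
    "size 46535229\n"
)
info = parse_lfs_pointer(pointer)
```

This is why the diff for a ~46 MB tarball is only three lines: git stores the pointer and LFS fetches the blob by digest.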
@@ -1,3 +0,0 @@
-version https://git-lfs.github.com/spec/v1
-oid sha256:21e9a235c72e49cafb7ed29829650f6a49f0a951194e1f0168b3b2f547362569
-size 1539739
 tokenizers-0.21.0.tar.gz | 3 +++ (new file)
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:841279ad797d575ed3cf31fc4f30e09e37acbd35028d30c51fc0879ef7ed4094
+size 1544853