Compare commits
2 Commits
| Author | SHA256 | Date |
|---|---|---|
| | 906241bdac | |
| | ff91514435 | |
@@ -1,3 +1,38 @@
+-------------------------------------------------------------------
+Tue Jul 29 15:12:29 UTC 2025 - John Paul Adrian Glaubitz <adrian.glaubitz@suse.com>
+
+- Update to 0.21.4
+  * No change; the 0.21.3 release failed, this is just a re-release
+- from version 0.21.3
+  * Clippy fixes
+  * Fixed an introduced backward-breaking change in our Rust APIs
+- from version 0.21.2
+  * Update the release builds following 0.21.1
+  * Replace lazy_static with std::sync::LazyLock, stabilized in Rust 1.80
+  * Fix no-onig no-wasm builds
+  * Fix typos in strings and comments
+  * Fix type notation of merges in the BPE Python binding
+  * Bump http-proxy-middleware from 2.0.6 to 2.0.9 in
+    /tokenizers/examples/unstable_wasm/www
+  * Fix data path in test_continuing_prefix_trainer_mismatch
+  * Clippy fixes by @ArthurZucker
+  * Update pyo3 and rust-numpy dependencies for no-gil/free-threading
+    compatibility
+  * Use ApiBuilder::from_env() in the from_pretrained function
+  * Upgrade onig to get it compiling with GCC 15
+  * Itertools upgrade
+  * Bump webpack-dev-server from 4.10.0 to 5.2.1
+    in /tokenizers/examples/unstable_wasm/www
+  * Bump brace-expansion from 1.1.11 to 1.1.12 in /bindings/node
+  * Fix features blending into a paragraph
+  * Add throughput to benches to have a more consistent measure across runs
+  * Upgrade dependencies
+  * [docs] Whitespace
+  * Hotfix the stub
+  * BPE clones
+  * Fixed-Length Pre-Tokenizer
+  * Consolidated optimization: ahash, dary, compact_str
+  * Breaking: Fix training with special tokens
+
 -------------------------------------------------------------------
 Wed Mar 19 18:26:11 UTC 2025 - Lucas Mulling <lucas.mulling@suse.com>
 
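One bullet in the changelog above notes the migration from the lazy_static crate to std::sync::LazyLock, which was stabilized in Rust 1.80. A minimal sketch of that pattern follows; the static name and the byte-to-char table are illustrative only, not tokenizers' actual code:

```rust
use std::collections::HashMap;
use std::sync::LazyLock;

// Before Rust 1.80 this kind of once-initialized static required the
// lazy_static! macro from the external lazy_static crate. LazyLock
// provides the same behavior in std: the closure runs on first access.
static BYTE_TO_CHAR: LazyLock<HashMap<u8, char>> = LazyLock::new(|| {
    // Hypothetical lookup table built lazily at first use.
    (b'a'..=b'z').map(|b| (b, b as char)).collect()
});

fn main() {
    // Dereferencing the LazyLock triggers initialization exactly once.
    assert_eq!(BYTE_TO_CHAR.len(), 26);
    assert_eq!(BYTE_TO_CHAR.get(&b'a'), Some(&'a'));
}
```

Call sites are unchanged by the migration; the win is dropping an external dependency, which is presumably why upstream made the swap once 1.80 became the minimum supported toolchain.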
@@ -22,7 +22,7 @@
 %{?sle15_python_module_pythons}
 Name:           python-tokenizers
-Version:        0.21.1
+Version:        0.21.4
 Release:        0
 Summary:        Provides an implementation of today's most used tokenizers
 License:        Apache-2.0
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:cedae83459a2008d8ab8b27bd2976fadae5df3d4664cddbda0333d510763cd8b
-size 55696377
+oid sha256:f91a949651901dc59bc4ca7641ef392d61e6927f0bcf6820d088d6206c7e72c2
+size 51237248
@@ -1,3 +0,0 @@
-version https://git-lfs.github.com/spec/v1
-oid sha256:65dd1f079404161321ece0d1af4f678e5846e3e6056f004600bde776d489bddc
-size 1545849
tokenizers-0.21.4.tar.gz (new file)
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:980f141601d9b6cc2d988657fd87e93c9d8599e3da7b4f5f77e3390a140edba3
+size 1552701