VaiTon/ollama (forked from pool/ollama)
ollama.obsinfo at commit 32d8d25838 (5 lines, 96 B, Plaintext)

Accepting request 1173521 from home:VaiTon:branches:science:machinelearning

- Update to version 0.1.37:
  * Fixed issue where models with uppercase characters in the name would not show with ollama list
  * Fixed usage string for ollama create
  * Fixed finish_reason being "" instead of null in the OpenAI-compatible chat API
- Use obs_scm service instead of the deprecated tar_scm
- Use zstd for vendor tarball compression

OBS-URL: https://build.opensuse.org/request/show/1173521
OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/ollama?expand=0&rev=13
2024-05-13 05:27:47 +02:00
name: ollama
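
The 0.1.37 entry above also changes how the package sources are fetched: the obs_scm service (which is what writes the ollama.obsinfo file shown here, with its name, version, mtime and commit fields) replaces the deprecated tar_scm, and the vendor tarball moves to zstd compression. A minimal _service sketch along those lines, for illustration only: the URL, revision, versionformat and compression values below are assumptions, not taken from this package.

    <!-- Illustrative sketch of an OBS _service file; values are assumed. -->
    <services>
      <!-- obs_scm checks the sources out of git and writes ollama.obsinfo
           (name, version, mtime, commit) for the build to consume. -->
      <service name="obs_scm">
        <param name="url">https://github.com/ollama/ollama.git</param>
        <param name="scm">git</param>
        <param name="revision">v0.1.37</param>
        <param name="versionformat">@PARENT_TAG@</param>
      </service>
      <!-- Pack the checkout into a tarball at build time... -->
      <service name="tar" mode="buildtime"/>
      <!-- ...and recompress it with zstd, matching the change noted above
           (zst support in obs-service-recompress is assumed here). -->
      <service name="recompress" mode="buildtime">
        <param name="file">*.tar</param>
        <param name="compression">zst</param>
      </service>
    </services>

Locally, something like osc service runall would re-run these services and regenerate both the tarball and the .obsinfo metadata.
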
Accepting request 1174682 from home:VaiTon:branches:science:machinelearning

- Update to version 0.1.38:
  * New model: Falcon 2, a new 11B-parameter causal decoder-only model built by TII and trained over 5T tokens
  * New model: Yi 1.5, a new high-performing version of Yi, now licensed under Apache 2.0 and available in 6B, 9B and 34B sizes
  * Added the ollama ps command
  * Added the /clear command
  * Fixed issue where switching loaded models on Windows would take several seconds
  * Running /save will no longer abort the chat session if an incorrect name is provided
  * The /api/tags API endpoint will now correctly return an empty list [] instead of null if no models are provided

OBS-URL: https://build.opensuse.org/request/show/1174682
OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/ollama?expand=0&rev=15
2024-05-16 22:34:11 +02:00
version: 0.1.38
mtime: 1715812996
commit: d1692fd3e0b4a80ff55ba052b430207134df4714
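
The last item in the 0.1.38 notes concerns /api/tags returning [] rather than null when no models are present. The snippet below is a quick way to check that behaviour against a local server; it is a sketch that assumes ollama's default address http://localhost:11434 and the documented {"models": [...]} response shape.

    # Sketch: query a local ollama server's /api/tags endpoint and inspect
    # the "models" field. Assumes the default address and response shape.
    import json
    import urllib.request

    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        tags = json.load(resp)

    # With 0.1.38 and no models pulled, this prints [] (an empty list);
    # older versions could return null here, i.e. None in Python.
    print(tags.get("models"))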