Accepting request 1202264 from science:machinelearning

OBS-URL: https://build.opensuse.org/request/show/1202264
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/ollama?expand=0&rev=20
Ana Guerrero 2024-09-22 09:06:09 +00:00 committed by Git OBS Bridge
commit f7aaf9b2af
8 changed files with 43 additions and 11 deletions

_service

@@ -3,7 +3,7 @@
<service name="obs_scm" mode="manual">
<param name="url">https://github.com/ollama/ollama.git</param>
<param name="scm">git</param>
<param name="revision">v0.3.10</param>
<param name="revision">v0.3.11</param>
<param name="versionformat">@PARENT_TAG@</param>
<param name="versionrewrite-pattern">v(.*)</param>
<param name="changesgenerate">enable</param>

_servicedata

@@ -1,4 +1,4 @@
<servicedata>
<service name="tar_scm">
<param name="url">https://github.com/ollama/ollama.git</param>
<param name="changesrevision">06d4fba851b91eb55da892d23834e8fe75096ca7</param></service></servicedata>
<param name="changesrevision">504a410f02e01a2ec948a92e4579a28295184898</param></service></servicedata>

BIN
ollama-0.3.10.obscpio (Stored with Git LFS)

Binary file not shown.

BIN
ollama-0.3.11.obscpio (Stored with Git LFS) Normal file

Binary file not shown.

ollama.changes

@@ -1,3 +1,35 @@
-------------------------------------------------------------------
Fri Sep 20 08:29:30 UTC 2024 - adrian@suse.de
- Update to version 0.3.11:
* llm: add solar pro (preview) (#6846)
* server: add tool parsing support for nemotron-mini (#6849)
* make patches git am-able
* CI: dist directories no longer present (#6834)
* CI: clean up naming, fix tagging latest (#6832)
* CI: set platform build build_linux script to keep buildx happy (#6829)
* readme: add Agents-Flex to community integrations (#6788)
* fix typo in import docs (#6828)
* readme: add vim-intelligence-bridge to Terminal section (#6818)
* readme: add Obsidian Quiz Generator plugin to community integrations (#6789)
* Fix incremental builds on linux (#6780)
* Use GOARCH for build dirs (#6779)
* Optimize container images for startup (#6547)
* examples: updated requirements.txt for privategpt example
* examples: polish loganalyzer example (#6744)
* readme: add ollama_moe to community integrations (#6752)
* runner: Flush pending responses before returning
* add "stop" command (#6739)
* refactor show output
* readme: add QodeAssist to community integrations (#6754)
* Verify permissions for AMD GPU (#6736)
* add *_proxy for debugging
* docs: update examples to use llama3.1 (#6718)
* Quiet down dockers new lint warnings (#6716)
* catch when model vocab size is set correctly (#6714)
* readme: add crewAI to community integrations (#6699)
* readme: add crewAI with mesop to community integrations
-------------------------------------------------------------------
Tue Sep 17 10:48:34 UTC 2024 - adrian@suse.de

ollama.obsinfo

@@ -1,4 +1,4 @@
name: ollama
-version: 0.3.10
-mtime: 1725725288
-commit: 06d4fba851b91eb55da892d23834e8fe75096ca7
+version: 0.3.11
+mtime: 1726621886
+commit: 504a410f02e01a2ec948a92e4579a28295184898

ollama.spec

@@ -17,7 +17,7 @@
Name: ollama
-Version: 0.3.10
+Version: 0.3.11
Release: 0
Summary: Tool for running AI models on-premise
License: MIT

BIN
vendor.tar.zstd (Stored with Git LFS)

Binary file not shown.
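With the spec Version, the .obsinfo metadata, and the regenerated archives in sync, the usual follow-up is a local test build and a submit request to Factory; a rough sketch from inside the package checkout (repository and architecture names are placeholders, pick ones configured for the project):

    # verify the 0.3.11 bump builds before sending it on
    osc build openSUSE_Factory x86_64 ollama.spec
    # commit to the devel project and forward to openSUSE:Factory
    osc commit -m "Update to version 0.3.11"
    osc submitrequest openSUSE:Factory -m "Update to version 0.3.11"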