ollama/_service (forked from pool/ollama)
Loren Burkholder 32d8d25838 Accepting request 1174682 from home:VaiTon:branches:science:machinelearning
- Update to version 0.1.38:
  * New model: Falcon 2: A new 11B parameters causal decoder-only
    model built by TII and trained over 5T tokens.
  * New model: Yi 1.5: A new high-performing version of Yi, now 
    licensed as Apache 2.0. Available in 6B, 9B and 34B sizes.
  * Added ollama ps command
  * Added /clear command
  * Fixed issue where switching loaded models on Windows would take
    several seconds
  * Running /save will no longer abort the chat session if an
    incorrect name is provided
  * The /api/tags API endpoint will now correctly return an empty
    list [] instead of null if no models are provided (a request
    sketch follows the commit metadata below)

OBS-URL: https://build.opensuse.org/request/show/1174682
OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/ollama?expand=0&rev=15
2024-05-16 20:34:11 +00:00
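
The /api/tags fix noted above is easy to observe from any HTTP client. Below is a minimal Go sketch, not part of this package, that assumes a local ollama server listening on the default port 11434 and simply counts the installed models; with 0.1.38 an empty install decodes as a zero-length models list rather than a null field.

// Minimal sketch (assumes a local ollama server on the default port 11434).
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
)

// tagsResponse mirrors only the fields needed here; the real response
// carries more metadata per model.
type tagsResponse struct {
	Models []struct {
		Name string `json:"name"`
	} `json:"models"`
}

func main() {
	resp, err := http.Get("http://localhost:11434/api/tags")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	var tags tagsResponse
	if err := json.NewDecoder(resp.Body).Decode(&tags); err != nil {
		log.Fatal(err)
	}

	// As of 0.1.38 this prints 0 (an empty [] in the JSON) rather than
	// failing on a null models field when no models have been pulled.
	fmt.Printf("%d model(s) installed\n", len(tags.Models))
}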


<services>
  <service name="format_spec_file" mode="manual" />
  <service name="obs_scm" mode="manual">
    <param name="url">https://github.com/ollama/ollama.git</param>
    <param name="scm">git</param>
    <param name="revision">v0.1.38</param>
    <param name="versionformat">@PARENT_TAG@</param>
    <param name="versionrewrite-pattern">v(.*)</param>
    <param name="changesgenerate">enable</param>
    <param name="submodules">enable</param>
    <param name="exclude">macapp</param>
    <param name="package-meta">yes</param>
  </service>
  <service name="go_modules" mode="manual">
    <param name="compression">zstd</param>
  </service>
  <service name="set_version" mode="manual" />
  <service name="tar" mode="buildtime">
    <param name="package-meta">yes</param>
  </service>
</services>