ollama/_service
Eyad Issa 2808304cf4 - Update to version 0.3.12:
  * Llama 3.2: Meta's Llama 3.2 goes small with 1B and 3B models.
  * Qwen 2.5 Coder: The latest series of code-specific Qwen models, with
    significant improvements in code generation, code reasoning, and code
    fixing.
  * Ollama now supports ARM Windows machines.
  * Fixed a rare issue where Ollama would report a missing .dll file on
    Windows.
  * Fixed a performance issue on Windows machines without GPUs.

OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/ollama?expand=0&rev=53
2024-09-29 21:30:54 +00:00

<services>
  <service name="format_spec_file" mode="manual" />
  <service name="obs_scm" mode="manual">
    <param name="url">https://github.com/ollama/ollama.git</param>
    <param name="scm">git</param>
    <param name="revision">v0.3.12</param>
    <param name="versionformat">@PARENT_TAG@</param>
    <param name="versionrewrite-pattern">v(.*)</param>
    <param name="changesgenerate">enable</param>
    <param name="submodules">enable</param>
    <param name="exclude">macapp</param>
    <param name="package-meta">yes</param>
  </service>
  <service name="go_modules" mode="manual">
    <param name="compression">zstd</param>
  </service>
  <service name="set_version" mode="manual" />
  <service name="tar" mode="buildtime">
    <param name="package-meta">yes</param>
  </service>
</services>
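
Usage sketch (assuming the standard osc workflow; the commands below are not part of the _service file itself): the services marked mode="manual" are run on demand by the packager from a local checkout, while the tar service with mode="buildtime" runs on the build host during the package build, recreating the source tarball from the obs_scm archive.

    osc co science:machinelearning/ollama
    cd science:machinelearning/ollama
    osc service manualrun   # obs_scm fetches tag v0.3.12 (with submodules, excluding macapp),
                            # go_modules vendors the Go dependencies as a zstd archive,
                            # set_version updates the spec, format_spec_file reformats it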