Eyad Issa
5a882751e3
* New Models
  + Granite 3 MoE: The IBM Granite 1B and 3B models are the first mixture of experts (MoE) Granite models from IBM, designed for low-latency usage.
  + Granite 3 Dense: The IBM Granite 2B and 8B models are designed to support tool-based use cases and retrieval augmented generation (RAG), streamlining code generation, translation and bug fixing.

OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/ollama?expand=0&rev=57
4 lines | 234 B | Plaintext
<servicedata>
  <service name="tar_scm">
    <param name="url">https://github.com/ollama/ollama.git</param>
    <param name="changesrevision">f2890a4494f9fb3722ee7a4c506252362d1eab65</param></service></servicedata>
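For context, _servicedata is written back by the tar_scm source service when changes generation is enabled: the service checks out the Git URL and records the fetched commit as changesrevision. The service itself is configured in the package's _service file. Below is a minimal sketch of such a definition, assuming tar_scm with changesgenerate; the mode, revision and versionformat values are illustrative assumptions, not the actual ollama package configuration.

<services>
  <!-- tar_scm fetches the Git repository; with changesgenerate enabled it
       updates changesrevision in _servicedata after each run -->
  <service name="tar_scm" mode="disabled">
    <param name="url">https://github.com/ollama/ollama.git</param>
    <param name="scm">git</param>
    <!-- illustrative: the real package may pin a release tag instead -->
    <param name="revision">main</param>
    <!-- illustrative version scheme derived from the nearest tag -->
    <param name="versionformat">@PARENT_TAG@</param>
    <param name="changesgenerate">enable</param>
  </service>
</services>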