* New Models
  + Granite 3 MoE: The IBM Granite 1B and 3B models are the first mixture of experts (MoE) Granite models from IBM, designed for low-latency usage.
  + Granite 3 Dense: The IBM Granite 2B and 8B models are designed to support tool-based use cases and retrieval augmented generation (RAG), streamlining code generation, translation and bug fixing.

OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/ollama?expand=0&rev=57