From 9c6d1dfa9234d732d41180c0eb34b7c28fe54580d081b22c71ae572e6d56ced5 Mon Sep 17 00:00:00 2001
From: Loren Burkholder <computersemiexpert@outlook.com>
Date: Wed, 6 Mar 2024 23:53:38 +0000
Subject: [PATCH] - Update to version 0.1.28:

  * Fix embeddings load model behavior (#2848)
  * Add Community Integration: NextChat (#2780)
  * prepend image tags (#2789)
  * fix: print usedMemory size right (#2827)
  * bump submodule to `87c91c07663b707e831c59ec373b5e665ff9d64a` (#2828)
  * Add ollama user to video group
  * Add env var so podman will map cuda GPUs
  * Omit build date from gzip headers
  * Log unexpected server errors checking for update
  * Refine container image build script
  * Bump llama.cpp to b2276
  * Determine max VRAM on macOS using `recommendedMaxWorkingSetSize` (#2354)
  * Update types.go (#2744)
  * Update langchain python tutorial (#2737)
  * no extra disk space for windows installation (#2739)
  * clean up go.mod
  * remove format/openssh.go
  * Add Community Integration: Chatbox
  * better directory cleanup in `ollama.iss`
  * restore windows build flags and compression

OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/ollama?expand=0&rev=6
---
 _service             |  2 +-
 ollama-0.1.28.tar.gz |  3 +++
 ollama.changes       | 12 ++++++++++++
 ollama.spec          |  3 ++-
 vendor.tar.xz        |  4 ++--
 5 files changed, 20 insertions(+), 4 deletions(-)
 create mode 100644 ollama-0.1.28.tar.gz

diff --git a/_service b/_service
index 68a33a1..35c0b4e 100644
--- a/_service
+++ b/_service
@@ -4,7 +4,7 @@
     https://github.com/ollama/ollama.git
     git
-    v0.1.27
+    v0.1.28
     @PARENT_TAG@
     v(.*)
     enable
diff --git a/ollama-0.1.28.tar.gz b/ollama-0.1.28.tar.gz
new file mode 100644
index 0000000..28355a7
--- /dev/null
+++ b/ollama-0.1.28.tar.gz
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:30225f7d1a96b8a573e82810584950ed2f9c95dcd2157d794c278ca44a43861b
+size 75624882
diff --git a/ollama.changes b/ollama.changes
index bec9b83..0d124e7 100644
--- a/ollama.changes
+++ b/ollama.changes
@@ -1,3 +1,15 @@
+-------------------------------------------------------------------
+Wed Mar 06 23:51:28 UTC 2024 - computersemiexpert@outlook.com
+
+- Update to version 0.1.28:
+  * Fix embeddings load model behavior (#2848)
+  * Add Community Integration: NextChat (#2780)
+  * prepend image tags (#2789)
+  * fix: print usedMemory size right (#2827)
+  * bump submodule to `87c91c07663b707e831c59ec373b5e665ff9d64a` (#2828)
+  * Add ollama user to video group
+  * Add env var so podman will map cuda GPUs
+
 -------------------------------------------------------------------
 Tue Feb 27 08:33:15 UTC 2024 - Jan Engelhardt
 
diff --git a/ollama.spec b/ollama.spec
index d43cf7a..0860883 100644
--- a/ollama.spec
+++ b/ollama.spec
@@ -15,8 +15,9 @@
 # Please submit bugfixes or comments via https://bugs.opensuse.org/
 #
 
+
 Name:           ollama
-Version:        0.1.27
+Version:        0.1.28
 Release:        0
 Summary:        Tool for running AI models on-premise
 License:        MIT
diff --git a/vendor.tar.xz b/vendor.tar.xz
index cd8ae41..363d08a 100644
--- a/vendor.tar.xz
+++ b/vendor.tar.xz
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:1668fa3db9f05fbb58eaf3e9200bd23ac93991cdff56234fac154296acc4e419
-size 2995404
+oid sha256:548f8d5870f6b0b2881f5f68ef3f45c8b77bba282e10dc1aecafe14396213327
+size 2993296