Compare commits
2 Commits
| Author | SHA256 | Date |
|---|---|---|
|  | 18d41bfaa3 |  |
|  | 2acc3720ee |  |
_service

@@ -4,6 +4,6 @@
   <service name="go_modules" mode="manual">
     <param name="compression">zstd</param>
-    <param name="replace">golang.org/x/net=golang.org/x/net@v0.46.0</param>
+    <param name="replace">golang.org/x/net=golang.org/x/net@v0.48.0</param>
   </service>
 </services>
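For orientation, the `go_modules` source service vendors the package's Go module dependencies at checkout time, and its `replace` parameter applies a module override before vendoring, here pinning golang.org/x/net to v0.48.0. A minimal sketch of the service block after this change (only the lines visible in the hunk; any other services in the real `_service` file are omitted):

```xml
<!-- Sketch only: reconstructed from the hunk above; other services
     in the actual _service file are not shown in the diff. -->
<services>
  <service name="go_modules" mode="manual">
    <param name="compression">zstd</param>
    <param name="replace">golang.org/x/net=golang.org/x/net@v0.48.0</param>
  </service>
</services>
```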
BIN  ollama-0.12.10.tar.gz (LFS): Binary file not shown.
BIN  ollama-0.13.5.tar.gz (LFS, new file): Binary file not shown.
@@ -1,3 +1,89 @@
+-------------------------------------------------------------------
+Fri Dec 19 12:01:05 UTC 2025 - Glen Masgai <glen.masgai@gmail.com>
+
+- Added 'Requires:' tag for subpackages to spec file
+
+- Update to version 0.13.5:
+  * New models: FunctionGemma
+  * 'bert' architecture models now run on Ollama's engine
+  * Added built-in renderer & tool parsing capabilities for
+    DeepSeek-V3.1
+  * Fixed issue where nested properties in tools may not have been
+    rendered properly
+
+-------------------------------------------------------------------
+Wed Dec 17 11:48:24 UTC 2025 - Glen Masgai <glen.masgai@gmail.com>
+
+- Update vendored golang.org/x/net/html to v0.48.0
+
+- Update to version 0.13.4:
+  * New models: Nemotron 3 Nano, Olmo 3, Olmo 3.1
+  * Enable Flash Attention automatically for models by default
+  * Fixed handling of long contexts with Gemma 3 models
+  * Fixed issue that would occur with Gemma 3 QAT models or
+    other models imported with the Gemma 3 architecture
+
+- Update to version 0.13.3:
+  * New models: Devstral-Small-2, rnj-1, nomic-embed-text-v2
+  * Improved truncation logic when using /api/embed and
+    /v1/embeddings
+  * Extend Gemma 3 architecture to support rnj-1 model
+  * Fix error that would occur when running qwen2.5vl with image
+    input
+
+- Update to version 0.13.2:
+  * New models: Qwen3-Next
+  * Flash attention is now enabled by default for vision models
+    such as mistral-3, gemma3, qwen3-vl and more. This improves
+    memory utilization and performance when providing images as
+    input.
+  * Fixed GPU detection on multi-GPU CUDA machines
+  * Fixed issue where deepseek-v3.1 would always think even when
+    thinking is disabled in Ollama's app
+
+-------------------------------------------------------------------
+Thu Dec 4 18:07:05 UTC 2025 - Eyad Issa <eyadlorenzo@gmail.com>
+
+- Update to version 0.13.1:
+  * New models: Ministral-3, Mistral-Large-3
+  * nomic-embed-text will now use Ollama's engine by default
+  * Tool calling support for cogito-v2.1
+  * Ollama will now better render errors instead of showing
+    "Unmarshal:" errors
+
+-------------------------------------------------------------------
+Sat Nov 22 04:14:47 UTC 2025 - Glen Masgai <glen.masgai@gmail.com>
+
+- Update to version 0.13.0:
+  * New models: DeepSeek-OCR, Cogito-V2.1
+  * DeepSeek-V3.1 architecture is now supported in Ollama's engine
+  * Fixed performance issues that arose in Ollama 0.12.11 on CUDA
+  * Fixed issue where Linux install packages were missing required
+    Vulkan libraries
+  * Improved CPU and memory detection while in containers/cgroups
+  * Improved VRAM information detection for AMD GPUs
+  * Improved KV cache performance to no longer require
+    defragmentation
+
+- Update to version 0.12.11:
+  * Ollama's API and the OpenAI-compatible API now support
+    logprobs, see
+    https://cookbook.openai.com/examples/using_logprobs and
+    https://github.com/ollama/ollama/releases/tag/v0.12.11
+  * Ollama's new app now supports WebP images
+  * Improved rendering performance in Ollama's new app, especially
+    when rendering code
+  * The "required" field in tool definitions will now be omitted if
+    not specified
+  * Fixed issue where "tool_call_id" would be omitted when using
+    the OpenAI-compatible API
+  * Fixed issue where ollama create would import data from both
+    consolidated.safetensors and other safetensor files
+  * Ollama will now prefer dedicated GPUs over iGPUs when
+    scheduling models
+  * Vulkan can now be enabled by setting OLLAMA_VULKAN=1.
+    For example: OLLAMA_VULKAN=1 ollama serve
+
 -------------------------------------------------------------------
 Mon Nov 10 19:34:43 UTC 2025 - Egbert Eich <eich@suse.com>
 
@@ -25,6 +111,7 @@ Fri Nov 7 15:40:39 UTC 2025 - Glen Masgai <glen.masgai@gmail.com>
 
 -------------------------------------------------------------------
 Sun Nov 2 04:00:05 UTC 2025 - Glen Masgai <glen.masgai@gmail.com>
 
 - Fixed issue with duplicated libraries (/usr/lib, /usr/lib64)
 
 - Update to version 0.12.9
@@ -35,7 +35,7 @@
 %define cuda_version %{cuda_version_major}-%{cuda_version_minor}
 
 Name:           ollama
-Version:        0.12.10
+Version:        0.13.5
 Release:        0
 Summary:        Tool for running AI models on-premise
 License:        MIT
@@ -102,18 +102,21 @@ can be imported.
 
 %package vulkan
 Summary:        Ollama Module using Vulkan
+Requires:       %{name} = %{version}-%{release}
 
 %description vulkan
 Ollama plugin module using Vulkan.
 
 %package cuda
 Summary:        Ollama Module using CUDA
+Requires:       %{name} = %{version}-%{release}
 
 %description cuda
 Ollama plugin module using NVIDIA CUDA.
 
 %package rocm
 Summary:        Ollama Module using AMD ROCm
+Requires:       %{name} = %{version}-%{release}
 
 %description rocm
 Ollama plugin module for ROCm.
@@ -157,7 +160,8 @@ sed -i -e 's@"lib"@"%{_lib}"@' \
     -UOLLAMA_INSTALL_DIR -DOLLAMA_INSTALL_DIR=%{_libdir}/ollama \
     -UCMAKE_INSTALL_BINDIR -DCMAKE_INSTALL_BINDIR=%{_libdir}/ollama \
     -DGGML_BACKEND_DIR=%{_libdir}/ollama \
-    %{?with_cuda:-DCMAKE_CUDA_COMPILER=/usr/local/cuda-%{cuda_version_major}.%{cuda_version_minor}/bin/nvcc} \
+    %{?with_cuda:-DCMAKE_CUDA_COMPILER=/usr/local/cuda-%{cuda_version_major}.%{cuda_version_minor}/bin/nvcc \
+                 -DCMAKE_CUDA_ARCHITECTURES=all} \
     %{?with_rocm:-DCMAKE_HIP_COMPILER=%rocmllvm_bindir/clang++ \
     -DAMDGPU_TARGETS=%{rocm_gpu_list_default}} \
     %{nil}
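The `%{?with_cuda:...}` wrapper in the hunk above is RPM's conditional macro syntax: the body after the `:` is emitted only when the `with_cuda` macro is defined, and expands to nothing otherwise. Assuming, purely for illustration, that the CUDA version macros expand to `13` and `0`, a CUDA-enabled build would pass roughly the following to CMake:

```
# Hypothetical expansion with with_cuda defined and CUDA 13.0:
-DCMAKE_CUDA_COMPILER=/usr/local/cuda-13.0/bin/nvcc \
-DCMAKE_CUDA_ARCHITECTURES=all \
# With with_cuda undefined, the whole %{?with_cuda:...} block vanishes.
```

So the new `-DCMAKE_CUDA_ARCHITECTURES=all` flag only takes effect in CUDA builds; non-CUDA builds are unchanged.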
@@ -165,6 +169,7 @@ sed -i -e 's@"lib"@"%{_lib}"@' \
 
 cd ..
 go build -trimpath -o %{name} .
+exit 1
 
 %install
 %cmake_install
BIN  vendor.tar.zstd (LFS): Binary file not shown.