#
# spec file for package ollama
#
# Copyright (c) 2024 SUSE LLC
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
# upon. The license for this file, and modifications and additions to the
# file, is the same license as for the pristine package itself (unless the
# license for the pristine package is not an Open Source License, in which
# case the license is the MIT License). An "Open Source License" is a
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.
# Please submit bugfixes or comments via https://bugs.opensuse.org/
#

Name:           ollama
Version:        0.1.31
Release:        0
Summary:        Tool for running AI models on-premise
License:        MIT
URL:            https://ollama.com
Source:         %{name}-%{version}.tar.gz
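# Source1 is assumed to carry the pre-fetched Go vendor tree so the build needs
# no network access; Source2 installs as the systemd unit and Source3 as the
# sysusers.d entry for the service account.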
Source1:        vendor.tar.xz
Source2:        ollama.service
Source3:        %{name}-user.conf
Patch0:         enable-lto.patch
BuildRequires:  cmake >= 3.24
BuildRequires:  gcc-c++ >= 11.4.0
BuildRequires:  git
BuildRequires:  sysuser-tools
BuildRequires:  golang(API) >= 1.22
%{sysusers_requires}

%description
Ollama is a tool for running AI models on one's own hardware.
It offers a command-line interface and a RESTful API.
New models can be created or existing ones modified in the
Ollama library using the Modelfile syntax.
Source model weights found on Hugging Face and similar sites
can be imported.

%prep
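# -a1 additionally unpacks Source1 (the vendor tarball) inside the unpacked
# sources; -p1 applies the patches with one leading path component stripped.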
%autosetup -a1 -p1

%build
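# Generate the ollama.pre scriptlet fragment that creates the ollama system
# user/group from the sysusers.d configuration (consumed via '-f' by the pre
# scriptlet below).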
%sysusers_generate_pre %{SOURCE3} %{name} %{name}-user.conf
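# Build a position-independent executable against the vendored Go modules;
# PIE is skipped on ppc64, presumably because Go's pie buildmode is not
# usable on that architecture.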
%ifnarch ppc64
export GOFLAGS="-buildmode=pie -mod=vendor"
%endif
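# Assumption: this keeps ollama's generate scripts from re-applying their own
# patches to the bundled llama.cpp sources during 'go generate'.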
export OLLAMA_SKIP_PATCHING=1
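# 'go generate' runs the scripts under the llm subdirectory that compile the
# bundled llama.cpp code; 'go build' then produces the ollama binary itself.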
go generate ./...
go build .

%install
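# Install the binary, the systemd unit, the sysusers.d configuration, and a
# state directory for the service user (ownership is set in the files list).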
install -D -m 0755 %{name} %{buildroot}/%{_bindir}/%{name}
install -D -m 0644 %{SOURCE2} %{buildroot}%{_unitdir}/%{name}.service
install -D -m 0644 %{SOURCE3} %{buildroot}%{_sysusersdir}/%{name}-user.conf
install -d %{buildroot}/var/lib/%{name}
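
# The service_add/service_del macros are SUSE's systemd scriptlet helpers; the
# generated ollama.pre fragment (from sysusers_generate_pre above) creates the
# ollama system user before the package files are installed.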
%pre -f %{name}.pre
%service_add_pre %{name}.service

%post
%service_add_post %{name}.service

%preun
%service_del_preun %{name}.service

%postun
%service_del_postun %{name}.service

%files
%doc README.md
%license LICENSE
%{_bindir}/%{name}
%{_unitdir}/%{name}.service
%{_sysusersdir}/%{name}-user.conf
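# /var/lib/ollama belongs to the ollama service account created via sysusers;
# it is assumed to serve as the daemon's state directory (e.g. for downloaded
# models).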
%attr(-, ollama, ollama) /var/lib/%{name}

%changelog