Eyad Issa
5bb20bbdee
* llm: add solar pro (preview) (#6846)
* server: add tool parsing support for nemotron-mini (#6849)
* make patches git am-able
* CI: dist directories no longer present (#6834)
* CI: clean up naming, fix tagging latest (#6832)
* CI: set platform build build_linux script to keep buildx happy (#6829)
* readme: add Agents-Flex to community integrations (#6788)
* fix typo in import docs (#6828)
* readme: add vim-intelligence-bridge to Terminal section (#6818)
* readme: add Obsidian Quiz Generator plugin to community integrations (#6789)
* Fix incremental builds on linux (#6780)
* Use GOARCH for build dirs (#6779)
* Optimize container images for startup (#6547)
* examples: updated requirements.txt for privategpt example
* examples: polish loganalyzer example (#6744)
* readme: add ollama_moe to community integrations (#6752)
* runner: Flush pending responses before returning
* add "stop" command (#6739)
* refactor show output
* readme: add QodeAssist to community integrations (#6754)
* Verify permissions for AMD GPU (#6736)
* add *_proxy for debugging
* docs: update examples to use llama3.1 (#6718)
* Quiet down dockers new lint warnings (#6716)
* catch when model vocab size is set correctly (#6714)
* readme: add crewAI to community integrations (#6699)
* readme: add crewAI with mesop to community integrations

OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/ollama?expand=0&rev=51
_servicedata: 4 lines, 234 B, Plaintext
<servicedata>
  <service name="tar_scm">
    <param name="url">https://github.com/ollama/ollama.git</param>
    <param name="changesrevision">504a410f02e01a2ec948a92e4579a28295184898</param></service></servicedata>
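For context: _servicedata is written by the OBS source services (here obs-service-tar_scm). With change generation enabled, tar_scm records the last fetched git revision in changesrevision so that the next run only adds .changes entries for commits newer than that revision. A companion _service definition that would produce a record like the one above might look roughly like the following minimal sketch; the revision value, versionformat, and the recompress/set_version entries are illustrative assumptions, not copied from the actual package:

<services>
  <service name="tar_scm" mode="disabled">
    <param name="url">https://github.com/ollama/ollama.git</param>
    <param name="scm">git</param>
    <!-- assumed tag; the real package may track a branch or a different tag -->
    <param name="revision">v0.3.11</param>
    <!-- assumed version scheme -->
    <param name="versionformat">@PARENT_TAG@</param>
    <!-- this is what makes tar_scm write changesrevision into _servicedata -->
    <param name="changesgenerate">enable</param>
  </service>
  <!-- illustrative follow-up services: repack and set the spec Version -->
  <service name="recompress" mode="disabled">
    <param name="file">*.tar</param>
    <param name="compression">gz</param>
  </service>
  <service name="set_version" mode="disabled"/>
</services>

Locally, running "osc service runall" in the package checkout executes these services, regenerates the tarball, and refreshes _servicedata with the new changesrevision.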