Guillaume GARDET
fd89371cd6
The level-zero library and NPU module sources are not yet compatible with gcc-14, so the package is built with gcc-13 for now. I am working with Intel on tests to return to the native gcc.
- Update to 2024.4.0
- Summary of major features and improvements
  * More Gen AI coverage and framework integrations to minimize code changes
    + Support for GLM-4-9B Chat, MiniCPM-1B, Llama 3 and 3.1, Phi-3-Mini, Phi-3-Medium and YOLOX-s models.
    + Noteworthy notebooks added: Florence-2, NuExtract-tiny Structure Extraction, Flux.1 Image Generation, PixArt-α: Photorealistic Text-to-Image Synthesis, and Phi-3-Vision Visual Language Assistant.
  * Broader Large Language Model (LLM) support and more model compression techniques.
    + OpenVINO™ runtime optimized for Intel® Xe Matrix Extensions (Intel® XMX) systolic arrays on built-in GPUs for efficient matrix multiplication, resulting in a significant LLM performance boost with improved 1st and 2nd token latency, as well as a smaller memory footprint on Intel® Core™ Ultra Processors (Series 2).
    + Memory sharing enabled for NPUs on Intel® Core™ Ultra Processors (Series 2) for efficient pipeline integration without memory copy overhead.
    + Addition of the PagedAttention feature for discrete GPUs* enables a significant boost in throughput for parallel inferencing when serving LLMs on Intel® Arc™ Graphics or Intel® Data Center GPU Flex Series.
  * More portability and performance to run AI at the edge, in the cloud, or locally.
    + OpenVINO™ Model Server now comes with production-quality support for the OpenAI-compatible API, which enables significantly higher throughput for parallel inferencing on Intel® Xeon® processors when serving LLMs to many concurrent users.
    + Improved performance and memory consumption with prefix caching, KV cache compression, and other optimizations for serving LLMs using OpenVINO™ Model Server.
    + Support for Python 3.12.
- Support Change and Deprecation Notices
  * Using deprecated features and components is not advised. They are available to enable a smooth transition to new solutions and will be discontinued in the future. To keep using discontinued features, you will have to revert to the last LTS OpenVINO version supporting them. For more details, refer to the OpenVINO Legacy Features and Components page.
  * Discontinued in 2024.0:
    + Runtime components:
      - Intel® Gaussian & Neural Accelerator (Intel® GNA). Consider using the Neural Processing Unit (NPU) for low-powered systems like Intel® Core™ Ultra or 14th generation and beyond.
      - OpenVINO C++/C/Python 1.0 APIs (see the 2023.3 API transition guide for reference).
      - All ONNX Frontend legacy API (known as ONNX_IMPORTER_API).
      - 'PerformanceMode.UNDEFINED' property as part of the OpenVINO Python API.
    + Tools:
      - Deployment Manager. See installation and deployment guides for current distribution options.
      - Accuracy Checker.
      - Post-Training Optimization Tool (POT). Neural Network Compression Framework (NNCF) should be used instead.
      - A Git patch for NNCF integration with huggingface/transformers. The recommended approach is to use huggingface/optimum-intel for applying NNCF optimization on top of models from Hugging Face.
      - Support for Apache MXNet, Caffe, and Kaldi model formats. Conversion to ONNX may be used as a solution.
  * Deprecated and to be removed in the future:
    + The macOS x86_64 debug bins will no longer be provided with the OpenVINO toolkit, starting with OpenVINO 2024.5.
    + Python 3.8 is now considered deprecated, and it will not be available beyond the 2024.4 OpenVINO version.
    + dKMB support is now considered deprecated and will be fully removed with OpenVINO 2024.5.
    + Intel® Streaming SIMD Extensions (Intel® SSE) will be supported in source code form, but not enabled in the binary package by default, starting with OpenVINO 2025.0.
    + The openvino-nightly PyPI module will soon be discontinued. End-users should proceed with the Simple PyPI nightly repo instead. More information in Release Policy.
    + The OpenVINO™ Development Tools package (pip install openvino-dev) will be removed from installation options and distribution channels beginning with OpenVINO 2025.0.
    + Model Optimizer will be discontinued with OpenVINO 2025.0. Consider using the new conversion methods instead. For more details, see the model conversion transition guide.
    + The OpenVINO property Affinity API will be discontinued with OpenVINO 2025.0. It will be replaced with CPU binding configurations (ov::hint::enable_cpu_pinning); see the sketch after this changelog entry.
    + OpenVINO Model Server components:
      - “auto shape” and “auto batch size” (reshaping a model in runtime) will be removed in the future. OpenVINO’s dynamic shape models are recommended instead.
    + A number of notebooks have been deprecated. For an up-to-date listing of available notebooks, refer to the OpenVINO™ Notebook index (openvinotoolkit.github.io).
OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/openvino?expand=0&rev=19
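For reference, a minimal sketch of the new CPU pinning hint with the python-openvino bindings might look like the following (assuming the 2024.x Python API exposes the property as openvino.properties.hint.enable_cpu_pinning; "model.xml" is a placeholder path):

  import openvino as ov
  import openvino.properties.hint as hints  # hint property namespace (assumed layout of the 2024.x bindings)

  core = ov.Core()
  model = core.read_model("model.xml")  # placeholder model
  # Request CPU core pinning through the hint property rather than the deprecated Affinity API
  compiled = core.compile_model(model, "CPU", {hints.enable_cpu_pinning: True})

The same key/value pair can be passed wherever a compile/configuration dictionary is accepted; the C++ equivalent is the ov::hint::enable_cpu_pinning property named above.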
440 lines
15 KiB
RPMSpec
#
# spec file for package openvino
#
# Copyright (c) 2024 SUSE LLC
# Copyright (c) 2024 Alessandro de Oliveira Faria (A.K.A. CABELO) <cabelo@opensuse.org> or <alessandro.faria@owasp.org>
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
# upon. The license for this file, and modifications and additions to the
# file, is the same license as for the pristine package itself (unless the
# license for the pristine package is not an Open Source License, in which
# case the license is the MIT License). An "Open Source License" is a
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.

# Please submit bugfixes or comments via https://bugs.opensuse.org/
#

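# Leap 15.x (suse_version < 1600) builds with python311 and the *-devel packages instead of the pkgconfig() modules used elsewhere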
%if 0%{?suse_version} < 1600
%define isLeap15 %nil
%else
%undefine isLeap15
%endif

# Compilation takes ~1 hr on OBS for a single python, don't try all supported flavours
%if %{defined isLeap15}
%define x86_64 x86_64
%define pythons python311
%else
%define pythons python3
%endif
%define __builder ninja
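# so_ver is the shared library version suffix used for the subpackage names below (e.g. libopenvino2440)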
%define so_ver 2440
%define shlib lib%{name}%{so_ver}
%define shlib_c lib%{name}_c%{so_ver}
%define prj_name OpenVINO

Name: openvino
Version: 2024.4.0
Release: 0
Summary: A toolkit for optimizing and deploying AI inference
# Let's be safe and put all third party licenses here, no matter that we use specific thirdparty libs or not
License: Apache-2.0 AND BSD-2-Clause AND BSD-3-Clause AND HPND AND JSON AND MIT AND OFL-1.1 AND Zlib
URL: https://github.com/openvinotoolkit/openvino
Source0: %{name}-%{version}.tar.zst
Source1: %{name}-rpmlintrc
# PATCH-FEATURE-OPENSUSE openvino-onnx-ml-defines.patch badshah400@gmail.com -- Define ONNX_ML at compile time when using system onnx to allow using 'onnx-ml.pb.h' instead of 'onnx.pb.h', the latter not being shipped with openSUSE's onnx-devel package
Patch0: openvino-onnx-ml-defines.patch
# PATCH-FEATURE-OPENSUSE openvino-fix-install-paths.patch badshah400@gmail.com -- Fix installation paths hardcoded into upstream defined cmake macros
Patch2: openvino-fix-install-paths.patch
# PATCH-FIX-UPSTREAM openvino-ComputeLibrary-include-string.patch badshah400@gmail.com -- Include header for std::string
Patch3: openvino-ComputeLibrary-include-string.patch
# PATCH-FIX-UPSTREAM openvino-fix-build-sample-path.patch cabelo@opensuse.org -- Fix sample source path in build script
Patch4: openvino-fix-build-sample-path.patch
# PATCH-FIX-UPSTREAM openvino-remove-npu-compile-tool.patch cabelo@opensuse.org -- Remove NPU Compile Tool
Patch5: openvino-remove-npu-compile-tool.patch

BuildRequires: ade-devel
BuildRequires: cmake
BuildRequires: fdupes
BuildRequires: gcc13-c++
BuildRequires: ninja
BuildRequires: opencl-cpp-headers
# FIXME: /usr/include/onnx/onnx-ml.pb.h:17:2: error: This file was generated by
# an older version of protoc which is incompatible with your Protocol Buffer
# headers. Please regenerate this file with a newer version of protoc.
#BuildRequires: cmake(ONNX)
BuildRequires: pkgconfig
BuildRequires: %{python_module devel}
BuildRequires: %{python_module pip}
BuildRequires: %{python_module pybind11-devel}
BuildRequires: %{python_module setuptools}
BuildRequires: %{python_module wheel}
BuildRequires: python-rpm-macros
BuildRequires: zstd
BuildRequires: pkgconfig(flatbuffers)
BuildRequires: pkgconfig(libva)
BuildRequires: pkgconfig(nlohmann_json)
BuildRequires: pkgconfig(ocl-icd)
BuildRequires: pkgconfig(protobuf)
BuildRequires: pkgconfig(pugixml)
%if %{defined isLeap15}
BuildRequires: opencl-headers
BuildRequires: snappy-devel
BuildRequires: tbb-devel
%else
BuildRequires: pkgconfig(OpenCL-Headers)
BuildRequires: pkgconfig(snappy)
BuildRequires: pkgconfig(tbb)
%endif
BuildRequires: pkgconfig(zlib)
%ifarch %{arm64}
BuildRequires: scons
%endif
# No 32-bit support
ExcludeArch: %{ix86} %{arm32} ppc
%define python_subpackage_only 1
%python_subpackages

%description
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

%package -n %{shlib}
Summary: Shared library for OpenVINO toolkit

%description -n %{shlib}
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the shared library for OpenVINO.

%package -n %{shlib_c}
Summary: Shared C library for OpenVINO toolkit

%description -n %{shlib_c}
This package provides the C library for OpenVINO.

%package -n %{name}-devel
Summary: Headers and sources for OpenVINO toolkit
Requires: %{shlib_c} = %{version}
Requires: %{shlib} = %{version}
Requires: lib%{name}_ir_frontend%{so_ver} = %{version}
Requires: lib%{name}_onnx_frontend%{so_ver} = %{version}
Requires: lib%{name}_paddle_frontend%{so_ver} = %{version}
Requires: lib%{name}_pytorch_frontend%{so_ver} = %{version}
Requires: lib%{name}_tensorflow_frontend%{so_ver} = %{version}
Requires: lib%{name}_tensorflow_lite_frontend%{so_ver} = %{version}
Requires: pkgconfig(flatbuffers)
Requires: pkgconfig(libva)
Requires: pkgconfig(nlohmann_json)
Requires: pkgconfig(ocl-icd)
Requires: pkgconfig(protobuf)
Requires: pkgconfig(pugixml)
%if %{defined isLeap15}
Requires: opencl-headers
Requires: snappy-devel
Requires: tbb-devel
%else
Requires: pkgconfig(OpenCL-Headers)
Requires: pkgconfig(snappy)
Requires: pkgconfig(tbb)
%endif
Recommends: %{name}-auto-batch-plugin = %{version}
Recommends: %{name}-auto-plugin = %{version}
Recommends: %{name}-hetero-plugin = %{version}
Recommends: %{name}-intel-cpu-plugin = %{version}
%ifarch riscv64
Recommends: %{name}-riscv-cpu-plugin = %{version}
%endif

%description -n %{name}-devel
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the headers and sources for developing applications with
OpenVINO.

%package -n %{name}-arm-cpu-plugin
Summary: ARM CPU plugin for OpenVINO toolkit

%description -n %{name}-arm-cpu-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the ARM CPU plugin for OpenVINO on %{arm64} archs.

%package -n %{name}-riscv-cpu-plugin
Summary: RISC-V CPU plugin for OpenVINO toolkit

%description -n %{name}-riscv-cpu-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the RISC-V CPU plugin for OpenVINO on riscv64 archs.

%package -n %{name}-auto-plugin
Summary: Auto / Multi software plugin for OpenVINO toolkit

%description -n %{name}-auto-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the Auto / Multi software plugin for OpenVINO.

%package -n %{name}-auto-batch-plugin
Summary: Automatic batch software plugin for OpenVINO toolkit

%description -n %{name}-auto-batch-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the automatic batch software plugin for OpenVINO.

%package -n %{name}-hetero-plugin
Summary: Hetero plugin for OpenVINO toolkit

%description -n %{name}-hetero-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the hetero plugin for OpenVINO.

%package -n %{name}-intel-cpu-plugin
Summary: Intel CPU plugin for OpenVINO toolkit

%description -n %{name}-intel-cpu-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the Intel CPU plugin for OpenVINO on %{x86_64} archs.

%package -n %{name}-intel-npu-plugin
Summary: Intel NPU plugin for OpenVINO toolkit

%description -n %{name}-intel-npu-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the Intel NPU plugin for OpenVINO on %{x86_64} archs.

%package -n lib%{name}_ir_frontend%{so_ver}
Summary: IR frontend for OpenVINO toolkit

%description -n lib%{name}_ir_frontend%{so_ver}
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the IR frontend for OpenVINO.

%package -n lib%{name}_onnx_frontend%{so_ver}
Summary: ONNX frontend for OpenVINO toolkit

%description -n lib%{name}_onnx_frontend%{so_ver}
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the ONNX frontend for OpenVINO.

%package -n lib%{name}_paddle_frontend%{so_ver}
Summary: Paddle frontend for OpenVINO toolkit

%description -n lib%{name}_paddle_frontend%{so_ver}
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the Paddle frontend for OpenVINO.

%package -n lib%{name}_pytorch_frontend%{so_ver}
Summary: PyTorch frontend for OpenVINO toolkit

%description -n lib%{name}_pytorch_frontend%{so_ver}
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the PyTorch frontend for OpenVINO.

%package -n lib%{name}_tensorflow_frontend%{so_ver}
Summary: TensorFlow frontend for OpenVINO toolkit

%description -n lib%{name}_tensorflow_frontend%{so_ver}
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the TensorFlow frontend for OpenVINO.

%package -n lib%{name}_tensorflow_lite_frontend%{so_ver}
Summary: TensorFlow Lite frontend for OpenVINO toolkit

%description -n lib%{name}_tensorflow_lite_frontend%{so_ver}
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides the TensorFlow Lite frontend for OpenVINO.

%package -n python-openvino
Summary: Python module for the OpenVINO toolkit
Requires: python-numpy < 2
Requires: python-openvino-telemetry

%description -n python-openvino
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides a Python module for interfacing with the OpenVINO toolkit.

%package -n %{name}-sample
Summary: Samples for use with OpenVINO toolkit
BuildArch: noarch

%description -n %{name}-sample
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.

This package provides some samples for use with OpenVINO.

%prep
%autosetup -p1

%build
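# gcc-13 is forced below because the level-zero library and NPU module sources do not build with gcc-14 yet (see changelog)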
export CC=gcc-13 CXX=g++-13
# Otherwise intel_cpu plugin declares an executable stack
%ifarch %{x86_64}
%define build_ldflags -Wl,-z,noexecstack
%endif
%cmake \
  -DCMAKE_CXX_STANDARD=17 \
  -DBUILD_SHARED_LIBS=ON \
  -DENABLE_OV_ONNX_FRONTEND=ON \
  -DENABLE_OV_PADDLE_FRONTEND=ON \
  -DENABLE_OV_PYTORCH_FRONTEND=ON \
  -DENABLE_OV_IR_FRONTEND=ON \
  -DENABLE_OV_TF_FRONTEND=ON \
  -DENABLE_OV_TF_LITE_FRONTEND=ON \
  -DENABLE_INTEL_GPU=OFF \
  -DENABLE_JS=OFF \
  -DENABLE_PYTHON=ON \
  -DENABLE_WHEEL=OFF \
  -DENABLE_SYSTEM_OPENCL=ON \
  -DENABLE_SYSTEM_PROTOBUF=ON \
  -DENABLE_SYSTEM_PUGIXML=ON \
  -DENABLE_SYSTEM_SNAPPY=ON \
  -DENABLE_SYSTEM_TBB=ON \
%if %{defined isLeap15}
  -DENABLE_TBBBIND_2_5=OFF \
%endif
  -DONNX_USE_PROTOBUF_SHARED_LIBS=ON \
  -DProtobuf_USE_STATIC_LIBS=OFF \
  %{nil}
%cmake_build
# Manually generate dist-info dir
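# (the build uses -DENABLE_WHEEL=OFF, so the dist-info metadata shipped with the Python module is created here by hand)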
export WHEEL_VERSION=%{version} \
       BUILD_TYPE=RelWithDebInfo
%ifarch %{power64}

# RelWithDebInfo
# Manual hackery for power64 because it is not "officially" supported
sed -i "s/{ARCH}/%{_arch}/" ../src/bindings/python/wheel/setup.py
%endif
%python_exec ../src/bindings/python/wheel/setup.py dist_info -o ../

%install
%cmake_install

# Hash-bangs in non-exec python sample scripts
sed -Ei "1{\@/usr/bin/env@d}" \
  %{buildroot}%{_datadir}/%{prj_name}/samples/python/benchmark/bert_benchmark/bert_benchmark.py \
  %{buildroot}%{_datadir}/%{prj_name}/samples/python/benchmark/sync_benchmark/sync_benchmark.py \
  %{buildroot}%{_datadir}/%{prj_name}/samples/python/benchmark/throughput_benchmark/throughput_benchmark.py \
  %{buildroot}%{_datadir}/%{prj_name}/samples/python/classification_sample_async/classification_sample_async.py \
  %{buildroot}%{_datadir}/%{prj_name}/samples/python/hello_classification/hello_classification.py \
  %{buildroot}%{_datadir}/%{prj_name}/samples/python/hello_query_device/hello_query_device.py \
  %{buildroot}%{_datadir}/%{prj_name}/samples/python/hello_reshape_ssd/hello_reshape_ssd.py \
  %{buildroot}%{_datadir}/%{prj_name}/samples/python/model_creation_sample/model_creation_sample.py

# Unnecessary if we get our package dependencies and lib paths right!
rm -fr %{buildroot}%{_prefix}/install_dependencies \
       %{buildroot}%{_prefix}/setupvars.sh

%{python_expand rm %{buildroot}%{$python_sitearch}/requirements.txt
chmod -x %{buildroot}%{$python_sitearch}/%{name}/tools/ovc/ovc.py
cp -r %{name}-%{version}.dist-info %{buildroot}%{$python_sitearch}/
%fdupes %{buildroot}%{$python_sitearch}/%{name}/
}

%fdupes %{buildroot}%{_datadir}/

# We do not use bundled thirdparty libs
rm -fr %{buildroot}%{_datadir}/licenses/*

%ldconfig_scriptlets -n %{shlib}
%ldconfig_scriptlets -n %{shlib_c}
%ldconfig_scriptlets -n lib%{name}_ir_frontend%{so_ver}
%ldconfig_scriptlets -n lib%{name}_onnx_frontend%{so_ver}
%ldconfig_scriptlets -n lib%{name}_paddle_frontend%{so_ver}
%ldconfig_scriptlets -n lib%{name}_pytorch_frontend%{so_ver}
%ldconfig_scriptlets -n lib%{name}_tensorflow_lite_frontend%{so_ver}
%ldconfig_scriptlets -n lib%{name}_tensorflow_frontend%{so_ver}

%files -n %{shlib}
%license LICENSE
%{_libdir}/libopenvino.so.*

%files -n %{shlib_c}
%license LICENSE
%{_libdir}/libopenvino_c.so.*

%files -n %{name}-auto-batch-plugin
%dir %{_libdir}/%{prj_name}
%{_libdir}/%{prj_name}/libopenvino_auto_batch_plugin.so

%files -n %{name}-auto-plugin
%dir %{_libdir}/%{prj_name}
%{_libdir}/%{prj_name}/libopenvino_auto_plugin.so

%ifarch %{x86_64}
%files -n %{name}-intel-cpu-plugin
%dir %{_libdir}/%{prj_name}
%{_libdir}/%{prj_name}/libopenvino_intel_cpu_plugin.so

%files -n %{name}-intel-npu-plugin
%dir %{_libdir}/%{prj_name}
%{_libdir}/%{prj_name}/libopenvino_intel_npu_plugin.so
%endif

%ifarch %{arm64}
%files -n %{name}-arm-cpu-plugin
%dir %{_libdir}/%{prj_name}
%{_libdir}/%{prj_name}/libopenvino_arm_cpu_plugin.so
%endif

%ifarch riscv64
%files -n %{name}-riscv-cpu-plugin
%dir %{_libdir}/%{prj_name}
%{_libdir}/%{prj_name}/libopenvino_riscv_cpu_plugin.so
%endif

%files -n %{name}-hetero-plugin
%dir %{_libdir}/%{prj_name}
%{_libdir}/%{prj_name}/libopenvino_hetero_plugin.so

%files -n lib%{name}_onnx_frontend%{so_ver}
%{_libdir}/libopenvino_onnx_frontend.so.*

%files -n lib%{name}_ir_frontend%{so_ver}
%{_libdir}/libopenvino_ir_frontend.so.*

%files -n lib%{name}_paddle_frontend%{so_ver}
%{_libdir}/libopenvino_paddle_frontend.so.*

%files -n lib%{name}_pytorch_frontend%{so_ver}
%{_libdir}/libopenvino_pytorch_frontend.so.*

%files -n lib%{name}_tensorflow_frontend%{so_ver}
%{_libdir}/libopenvino_tensorflow_frontend.so.*

%files -n lib%{name}_tensorflow_lite_frontend%{so_ver}
%{_libdir}/libopenvino_tensorflow_lite_frontend.so.*

%files -n %{name}-sample
%license LICENSE
%{_datadir}/%{prj_name}/

%files -n %{name}-devel
%license LICENSE
%{_includedir}/%{name}/
%{_libdir}/cmake/%{prj_name}/
%{_libdir}/*.so
%{_libdir}/pkgconfig/openvino.pc

%files %{python_files openvino}
%license LICENSE
%{python_sitearch}/openvino/
%{python_sitearch}/openvino*-info/

%changelog