openvino/_service
Guillaume GARDET 6f04946ce8 - openvino-onnx-ml-defines.patch and
  openvino-remove-npu-compile-tool.patch have been removed
  as they are no longer needed in this version.
- Update to 2024.6.0
- Summary of major features and improvements  
  * OpenVINO 2024.6 release includes updates for enhanced 
    stability and improved LLM performance.
  * Introduced support for Intel® Arc™ B-Series Graphics 
    (formerly known as Battlemage).
  * Implemented optimizations to improve the inference time and 
    LLM performance on NPUs.
  * Improved LLM performance with GenAI API optimizations and 
    bug fixes.
- Support Change and Deprecation Notices
  * Using deprecated features and components is not advised. They
    are available to enable a smooth transition to new solutions 
    and will be discontinued in the future. To keep using 
    discontinued features, you will have to revert to the last 
    LTS OpenVINO version supporting them. For more details, refer
    to the OpenVINO Legacy Features and Components page.
  * Discontinued in 2024.0:
    + Runtime components:
      - Intel® Gaussian & Neural Accelerator (Intel® GNA).
        Consider using the Neural Processing Unit (NPU) for
        low-powered systems like Intel® Core™ Ultra or 14th
        generation and beyond (see the NPU sketch below).
      - OpenVINO C++/C/Python 1.0 APIs (see 2023.3 API transition
        guide for reference).
      - All ONNX Frontend legacy API (known as ONNX_IMPORTER_API)
      - 'PerformanceMode.UNDEFINED' property as part of the
        OpenVINO Python API (an explicit hint can be set
        instead; see the sketch below).
    + Tools:
      - Deployment Manager. See installation and deployment 
        guides for current distribution options.
      - Accuracy Checker.
      - Post-Training Optimization Tool (POT). Neural Network
        Compression Framework (NNCF) should be used instead
        (see the NNCF sketch below).
      - A Git patch for NNCF integration with
        huggingface/transformers. The recommended approach is
        to use huggingface/optimum-intel for applying NNCF
        optimization on top of models from Hugging Face (see
        the optimum-intel sketch below).
      - Support for Apache MXNet, Caffe, and Kaldi model formats.
        Conversion to ONNX may be used as a solution (see the
        ONNX sketch below).
  * Deprecated and to be removed in the future:
    + The macOS x86_64 debug bins will no longer be provided 
      with the OpenVINO toolkit, starting with OpenVINO 2024.5.
    + Python 3.8 is no longer supported, starting with 
      OpenVINO 2024.5.
    + As MXNet doesn’t support Python versions higher than 3.8,
      according to the MXNet PyPI project, it is no longer
      supported by OpenVINO either.
    + Discrete Keem Bay is no longer supported, starting
      with OpenVINO 2024.5.
    + Support for discrete devices (formerly codenamed Raptor 
      Lake) is no longer available for NPU.
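
  A minimal sketch of the runtime-component migration noted above,
  assuming an IR model at a hypothetical path "model.xml" and an
  available NPU plugin: the model is compiled for "NPU" instead of
  the discontinued GNA plugin, with an explicit performance hint in
  place of the removed PerformanceMode.UNDEFINED value.

    import openvino as ov
    import openvino.properties.hint as hints

    core = ov.Core()
    model = core.read_model("model.xml")  # hypothetical IR path

    # Compile for the NPU plugin (successor to GNA for low-power
    # inference) with an explicit performance hint, since the
    # UNDEFINED hint value has been removed from the Python API.
    compiled = core.compile_model(
        model, "NPU",
        {hints.performance_mode: hints.PerformanceMode.LATENCY},
    )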
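
  A minimal sketch of post-training quantization with NNCF in place
  of the removed POT, assuming an IR model at a hypothetical
  "model.xml" with a 1x3x224x224 input; the random calibration data
  below is a placeholder for a real calibration set.

    import numpy as np
    import nncf
    import openvino as ov

    core = ov.Core()
    model = core.read_model("model.xml")  # hypothetical IR path

    # Placeholder calibration samples; a real workflow would iterate
    # over representative, preprocessed inputs instead.
    data_items = [np.random.rand(1, 3, 224, 224).astype(np.float32)
                  for _ in range(10)]
    calibration_dataset = nncf.Dataset(data_items, lambda sample: sample)

    quantized_model = nncf.quantize(model, calibration_dataset)
    ov.save_model(quantized_model, "model_int8.xml")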
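
  A minimal sketch of the recommended optimum-intel route in place
  of the removed transformers patch, using an example Hugging Face
  model ID; optimum-intel's quantization utilities (such as
  OVQuantizer) apply NNCF under the hood.

    from optimum.intel import OVModelForSequenceClassification
    from transformers import AutoTokenizer

    model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # example
    # export=True converts the Transformers checkpoint to OpenVINO IR.
    model = OVModelForSequenceClassification.from_pretrained(model_id,
                                                             export=True)
    tokenizer = AutoTokenizer.from_pretrained(model_id)

    inputs = tokenizer("OpenVINO makes inference fast.", return_tensors="pt")
    outputs = model(**inputs)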
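
  A minimal sketch of the ONNX route for models that previously came
  from the MXNet, Caffe, or Kaldi frontends, assuming the model has
  already been exported to a hypothetical "model.onnx".

    import openvino as ov

    # Convert the ONNX model in memory, then serialize it as OpenVINO IR.
    ov_model = ov.convert_model("model.onnx")
    ov.save_model(ov_model, "model.xml")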

OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/openvino?expand=0&rev=23
2025-01-07 17:14:03 +00:00

<services>
  <service name="obs_scm" mode="manual">
    <param name="url">https://github.com/openvinotoolkit/openvino.git</param>
    <param name="scm">git</param>
    <param name="revision">2024.6.0</param>
    <param name="version">2024.6.0</param>
    <param name="submodules">enable</param>
    <param name="filename">openvino</param>
    <param name="exclude">.git</param>
  </service>
  <service name="tar" mode="buildtime" />
  <service name="recompress" mode="buildtime">
    <param name="file">*.tar</param>
    <param name="compression">zstd</param>
  </service>
</services>