Commit Graph

3 Commits

Author SHA256 Message Date
Ana Guerrero
37c4e1817a Accepting request 1235692 from science:machinelearning
OBS-URL: https://build.opensuse.org/request/show/1235692
OBS-URL: https://build.opensuse.org/package/show/openSUSE:Factory/openvino?expand=0&rev=8
2025-01-09 14:06:03 +00:00
6f04946ce8 - openvino-onnx-ml-defines.patch and
  openvino-remove-npu-compile-tool.patch have been removed
  as they are no longer needed in this version.
- Update to 2024.6.0
- Summary of major features and improvements  
  * OpenVINO 2024.6 release includes updates for enhanced 
    stability and improved LLM performance.
  * Introduced support for Intel® Arc™ B-Series Graphics 
    (formerly known as Battlemage).
  * Implemented optimizations to improve the inference time and 
    LLM performance on NPUs.
  * Improved LLM performance with GenAI API optimizations and 
    bug fixes.
- Support Change and Deprecation Notices
  * Using deprecated features and components is not advised. They
    are available to enable a smooth transition to new solutions 
    and will be discontinued in the future. To keep using 
    discontinued features, you will have to revert to the last 
    LTS OpenVINO version supporting them. For more details, refer
    to the OpenVINO Legacy Features and Components page.
  * Discontinued in 2024.0:
    + Runtime components:
      - Intel® Gaussian & Neural Accelerator (Intel® GNA).
        Consider using the Neural Processing Unit (NPU) for 
        low-powered systems like Intel® Core™ Ultra or 14th 
        generation and beyond.
      - OpenVINO C++/C/Python 1.0 APIs (see 2023.3 API transition
        guide for reference).
      - All ONNX Frontend legacy API (known as ONNX_IMPORTER_API)
      - 'PerformanceMode.UNDEFINED' property as part of the
        OpenVINO Python API (a sketch of setting an explicit
        performance hint follows these notes).
    + Tools:
      - Deployment Manager. See installation and deployment 
        guides for current distribution options.
      - Accuracy Checker.
      - Post-Training Optimization Tool (POT). Neural Network
        Compression Framework (NNCF) should be used instead
        (see the NNCF quantization sketch after these notes).
      - A Git patch for NNCF integration with
        huggingface/transformers. The recommended approach is
        to use huggingface/optimum-intel for applying NNCF
        optimization on top of models from Hugging Face (see
        the optimum-intel sketch after these notes).
      - Support for Apache MXNet, Caffe, and Kaldi model formats.
        Conversion to ONNX may be used as a solution.
  * Deprecated and to be removed in the future:
    + The macOS x86_64 debug bins will no longer be provided 
      with the OpenVINO toolkit, starting with OpenVINO 2024.5.
    + Python 3.8 is no longer supported, starting with 
      OpenVINO 2024.5.
    + As MXNet does not support Python versions higher than
      3.8, according to the MXNet PyPI project, it is no
      longer supported by OpenVINO either.
    + Discrete Keem Bay is no longer supported, starting
      with OpenVINO 2024.5.
    + Support for discrete devices (formerly codenamed Raptor 
      Lake) is no longer available for NPU.
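
A minimal sketch of replacing the removed 'PerformanceMode.UNDEFINED'
value with an explicit hint via the OpenVINO Python API; the model
path "model.xml" and the CPU device choice are illustrative
placeholders, not part of the packaged changelog:

  import openvino as ov
  import openvino.properties.hint as hints

  core = ov.Core()
  model = core.read_model("model.xml")  # placeholder IR path
  # Request an explicit performance hint instead of the removed
  # UNDEFINED value; THROUGHPUT or CUMULATIVE_THROUGHPUT also work.
  compiled = core.compile_model(
      model, "CPU",
      {hints.performance_mode: hints.PerformanceMode.LATENCY},
  )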
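
A minimal sketch of the POT-to-NNCF migration mentioned above,
assuming an FP32 IR at "model.xml" and a small iterable of
preprocessed inputs named calibration_items (both placeholders):

  import nncf
  import openvino as ov

  core = ov.Core()
  model = core.read_model("model.xml")  # placeholder FP32 model

  # Wrap the (assumed) calibration inputs for NNCF post-training
  # quantization, the suggested replacement for POT.
  calibration_dataset = nncf.Dataset(calibration_items)
  quantized_model = nncf.quantize(model, calibration_dataset)
  ov.save_model(quantized_model, "model_int8.xml")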
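
A minimal sketch of the huggingface/optimum-intel route recommended
above in place of the removed NNCF Git patch; the checkpoint id is an
illustrative placeholder:

  from optimum.intel import OVModelForSequenceClassification
  from transformers import AutoTokenizer

  model_id = "distilbert-base-uncased-finetuned-sst-2-english"
  tokenizer = AutoTokenizer.from_pretrained(model_id)

  # export=True converts the Hugging Face checkpoint to OpenVINO IR
  ov_model = OVModelForSequenceClassification.from_pretrained(
      model_id, export=True
  )

  inputs = tokenizer("OpenVINO makes inference fast.", return_tensors="pt")
  outputs = ov_model(**inputs)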

OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/openvino?expand=0&rev=23
2025-01-07 17:14:03 +00:00
799becee5b Accepting request 1169921 from science
OpenVINO is an open-source toolkit developed by Intel for optimizing and deploying AI inference

Superseded after specfile updates

OBS-URL: https://build.opensuse.org/request/show/1169921
OBS-URL: https://build.opensuse.org/package/show/science:machinelearning/openvino?expand=0&rev=1
2024-05-06 10:14:29 +00:00