AI Toolkit for IBM Z and LinuxONE

AI Toolkit for IBM Z and LinuxONE is a family of popular open-source AI frameworks, backed by IBM Elite Support and adapted for IBM Z and IBM LinuxONE hardware.

Vendor

IBM

Company Website

Product details

AI Toolkit for IBM Z® and LinuxONE is a family of popular open source AI frameworks, backed by IBM® Elite Support and adapted for IBM Z and LinuxONE hardware.

While open source software has made AI more accessible, affordable and innovative, you need the right level of support to successfully implement these frameworks. With the AI Toolkit for IBM Z and LinuxONE, you can use our verified support offering to deploy and accelerate the adoption of popular open source AI frameworks on your z/OS® and LinuxONE platforms.

The AI Toolkit consists of IBM Elite Support and IBM Secure Engineering, which vet and scan open source AI serving frameworks and IBM-certified containers for security vulnerabilities and validate compliance with industry regulations.

Features

  • **Free download of AI frameworks:** Reduce costs and complexity while accelerating time to market with lightweight, free-to-download tools and runtime packages.
  • **TensorFlow compatible:** Accelerate seamless integration of TensorFlow with IBM Z Accelerated for TensorFlow to develop and deploy machine learning (ML) models on neural networks (sketch below).
  • **AI frameworks integration:** Use IBM Z Accelerated for NVIDIA Triton Inference Server to streamline and standardize AI inference by deploying ML or deep learning (DL) models from any framework on any GPU- or CPU-based infrastructure (sketch below).
  • **ML models with TensorFlow Serving:** Harness the benefits of TensorFlow Serving, a flexible, high-performance serving system, with IBM Z Accelerated for TensorFlow Serving to help deploy ML models in production (sketch below).
  • **Compile models with IBM zDLC:** Convert ML models into code that can run on z/OS or LinuxONE with the help of the IBM Z Deep Learning Compiler (IBM zDLC).
  • **Run Snap ML:** Use IBM Z Accelerated for Snap ML to build and deploy ML models with Snap ML, an IBM non-warranted program that optimizes the training and scoring of popular ML models (sketch below).
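
To illustrate the TensorFlow item above: inside the IBM Z Accelerated for TensorFlow container, standard TensorFlow code runs unchanged. The minimal sketch below defines, trains, and exports a small Keras model in SavedModel format; the data, model shape, and export path are illustrative placeholders, not part of the toolkit's documentation.

```python
import numpy as np
import tensorflow as tf

# Toy binary-classification data standing in for real transaction features.
X = np.random.rand(1024, 16).astype("float32")
y = (X.sum(axis=1) > 8.0).astype("float32")

# A small feed-forward network built with the standard Keras API.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(16,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=64, verbose=1)

# Export in SavedModel format (TensorFlow 2.x), the format that
# TensorFlow Serving loads; the path and model name are placeholders.
model.save("models/fraud_demo/1")
```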
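
For the Triton item above, a client typically sends requests with NVIDIA's tritonclient Python package. This is a generic sketch over HTTP; the server address, model name ("fraud_demo"), and tensor names ("INPUT0", "OUTPUT0") are assumptions that must match the configuration of the model you actually deploy.

```python
import numpy as np
import tritonclient.http as httpclient

# Connect to a Triton Inference Server endpoint (address is illustrative).
client = httpclient.InferenceServerClient(url="localhost:8000")

# Build the request; tensor names and shapes must match the model's config.
batch = np.random.rand(1, 16).astype(np.float32)
infer_input = httpclient.InferInput("INPUT0", batch.shape, "FP32")
infer_input.set_data_from_numpy(batch)
requested_output = httpclient.InferRequestedOutput("OUTPUT0")

# Run inference against a hypothetical model called "fraud_demo".
response = client.infer(model_name="fraud_demo",
                        inputs=[infer_input],
                        outputs=[requested_output])
print(response.as_numpy("OUTPUT0"))
```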
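
For the TensorFlow Serving item above, inference requests go to TensorFlow Serving's documented REST endpoint (/v1/models/&lt;name&gt;:predict, port 8501 by default). The host, model name, and feature vector in this sketch are placeholders for a model already loaded by the serving container.

```python
import requests

# TensorFlow Serving REST predict endpoint; host and model name are placeholders.
url = "http://localhost:8501/v1/models/fraud_demo:predict"

# One 16-feature example in the "instances" format expected by the REST API.
payload = {"instances": [[0.1] * 16]}

response = requests.post(url, json=payload, timeout=10)
response.raise_for_status()
print(response.json()["predictions"])
```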
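
For the Snap ML item above, Snap ML exposes scikit-learn-style estimators with the familiar fit/predict interface. The sketch below assumes the snapml Python package and its RandomForestClassifier; the class name, parameters, and toy data are assumptions to check against the Snap ML version you install.

```python
import numpy as np
from snapml import RandomForestClassifier  # scikit-learn-style Snap ML estimator

# Toy tabular data standing in for e.g. credit-card transaction features.
X = np.random.rand(2000, 16).astype(np.float32)
y = (X[:, 0] + X[:, 1] > 1.0).astype(np.float32)

# Train and score with the familiar fit/predict interface; Snap ML handles
# the optimized training and scoring under the hood.
clf = RandomForestClassifier(n_estimators=50, max_depth=6)
clf.fit(X, y)
print(clf.predict(X[:5]))
```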

Benefits

  • **Deploy with confidence:** Use the premium support offered by IBM Elite Support to get expert guidance when you need it for successfully deploying open source AI and IBM non-warranted software.
  • **Improve performance:** Use the IBM Z Integrated Accelerator for AI to significantly improve the performance of open source and IBM non-warranted AI programs.
  • **Use AI frameworks:** Capitalize on both deep learning and traditional ML approaches to create and serve AI models.

Deliver innovation through open source with the AI Toolkit for IBM Z and LinuxONE.

  • **Expedite fraud detection:** Run fraud inferencing on digital currency transactions 85% faster by colocating your application with Snap ML on IBM LinuxONE Emperor 4.
  • **Enhance biomedical image throughput:** On IBM z16™ single frame, using the Integrated Accelerator for AI provides 6.8x more throughput for inferencing on biomedical image data with TensorFlow 2.9.1 compared with IBM z16 single frame alone.
  • **Boost biomedical image inferencing:** On IBM z16 multi frame and LinuxONE Emperor 4, using the Integrated Accelerator for AI provides 2.5x more throughput for inferencing on biomedical image data with TensorFlow Serving versus a compared x86 system.
  • **Lower fraud response times:** Run credit card fraud detection with 7x lower response times using the ONNX-MLIR backend for NVIDIA Triton on IBM z16 multi frame and LinuxONE Emperor 4 versus using the ONNX Runtime backend for NVIDIA Triton on a compared x86 server.
  • **Accelerate customer transaction prediction:** Run prediction of customer transactions 3.5x faster by colocating your application with the Snap ML library on IBM z16 multi frame and LinuxONE Emperor 4 versus running prediction remotely using the NVIDIA Forest Inference Library on a compared x86 server.