AMD Vitis™ AI

An optimized AI inference development platform for AMD adaptable SoCs, FPGAs and Alveo accelerators.


Overview

AMD Vitis™ AI is an AI inference development platform for AMD adaptable SoCs, FPGAs and Alveo data-center acceleration cards. It provides optimized model libraries, efficient Deep Learning Processor Unit (DPU) cores, toolchains for quantizing and compiling neural networks, deployment libraries, and example designs to accelerate inference on edge and data-center targets. The toolset is distributed as containers and accompanied by documentation, model-zoo examples, and developer resources for prototyping, optimizing, and deploying AI workloads.

Features and Capabilities

  • Core Components: Vitis AI combines model development tools, optimized DPU/NPU IP, runtime libraries, and sample applications to support complete inference workflows on AMD targets.
  • Model Optimization & Compilation: Tools for quantization, pruning, and compilation that convert models from common frameworks (TensorFlow, PyTorch, ONNX) into device-optimized formats for efficient inference on DPU cores.
  • Deployment Targets: Support for AMD adaptable SoCs, evaluation boards (e.g., KV260, VCK190), Alveo accelerator cards and other FPGA/SoC platforms for edge and data-center use cases.
  • Containerized Distribution: Releases are provided as containers (tools container, examples, model zoo), simplifying reproducible builds and deployment across hosts and CI workflows.
  • Model Zoo & Examples: A curated model zoo and end-to-end examples (image classification, detection, segmentation, video analytics) accelerate PoC development and benchmarking.
  • DPU/IP & Integration: Provides DPU IP cores and system-integration guides for embedding the hardware accelerator into custom designs, plus APIs and runtime glue for application integration.
  • Documentation & Developer Hub: Comprehensive docs, tutorials and developer hub resources to guide installation, board images, running library examples and optimization workflows.
  • Open Source & Community: Sources, code samples and toolchain components are available to support collaboration and extension.
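As a rough illustration of the containerized distribution described above, the sketch below shows how a developer might fetch and enter a Vitis AI tools container. The image name and tag are assumptions based on the historical `xilinx/vitis-ai-cpu` Docker Hub repository; newer releases ship framework-specific images, so check the Vitis AI release notes for the current names. The heavyweight commands are left as comments so the script itself stays side-effect free.

```shell
# Hedged sketch: containerized Vitis AI workflow.
# IMAGE is an assumption -- consult the Vitis AI docs for the image
# matching your release and framework (PyTorch, TensorFlow, ONNX).
IMAGE="xilinx/vitis-ai-cpu:latest"

# Fetch the tools container:
#   docker pull "$IMAGE"
# Start an interactive tools shell with the current workspace mounted,
# so quantizer/compiler outputs land in the host directory:
#   docker run -it --rm -v "$PWD":/workspace -w /workspace "$IMAGE"
echo "Vitis AI tools image: $IMAGE"
```

Mounting the workspace into the container keeps builds reproducible across hosts and CI runners, since the toolchain versions are pinned by the image tag rather than by the host environment.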