NVIDIA MLPerf Inference

A benchmark suite for measuring how fast systems can run machine learning models in a variety of deployment scenarios.

Vendor

NVIDIA

Company Website

Product details

MLPerf Inference is a benchmark suite designed to measure the performance of systems running machine learning models in a variety of deployment scenarios. The NVIDIA MLPerf Inference containers provide the base environment for reproducing NVIDIA's MLPerf Inference submission results. These containers are optimized for benchmarking and should not be used in production environments.

Features

  • Benchmarking Suite: Measures the performance of systems running ML models.
  • NVIDIA-Optimized Implementations: Includes NVIDIA's optimized model implementations used in its benchmark submissions.
  • Base Containers: Provides the necessary environment to reproduce NVIDIA's MLPerf Inference results.
  • Security: Signed images and comprehensive security scanning.
  • Architecture Support: Built for the linux/amd64 platform.

Benefits

  • Performance Measurement: Enables accurate measurement of system performance for ML model inference.
  • Reproducibility: Allows users to reproduce NVIDIA's published MLPerf Inference results.
  • Optimization: Utilizes NVIDIA's optimized implementations for benchmarking.
  • Security: Ensures secure benchmarking with signed images and thorough security scans.