
NVIDIA Magnum IO GPUDirect Storage (GDS)
NVIDIA GPUDirect Storage creates a direct data path between storage and GPU memory, enhancing application performance by reducing CPU load.
Vendor
NVIDIA
Product details
NVIDIA Magnum IO GPUDirect Storage (GDS) is a technology that creates a direct data path between local or remote storage, such as NVMe or NVMe over Fabrics (NVMe-oF), and GPU memory. By enabling a direct-memory access (DMA) engine near the network adapter or storage, GDS moves data into or out of GPU memory without burdening the CPU. This technology is designed to enhance application performance by reducing the time spent loading data, making it ideal for large datasets.
Features
- Direct Data Path: Establishes a direct data path between storage and GPU memory, avoiding extra copies through a bounce buffer in the CPU’s memory.
- DMA Engine: Utilizes a DMA engine near the network adapter or storage to move data efficiently.
- Support for NVMe P2PDMA: Eliminates the need for custom MOFED patches and the nvidia-fs.ko kernel driver.
- Amazon FSx for Lustre: Added support for Amazon FSx for Lustre.
- Improved I/O Performance: Improves I/O performance for unregistered buffers.
- P2P Mode: Enabled for DDN EXAScaler on Grace Hopper systems.
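
The direct data path described above is exposed to applications through the cuFile API. The sketch below shows the typical sequence: open the GDS driver, register a file and a GPU buffer, then read storage data straight into GPU memory without a host-side bounce buffer. It assumes a Linux system with the CUDA Toolkit and GDS installed (link with -lcufile -lcudart); the file path is hypothetical, and real code should check each CUfileError_t return value.

```c
#define _GNU_SOURCE            /* for O_DIRECT */
#include <cufile.h>
#include <cuda_runtime.h>
#include <fcntl.h>
#include <unistd.h>
#include <stdio.h>

int main(void) {
    const size_t size = 1 << 20;               /* 1 MiB read, illustrative */
    const char *path = "/mnt/data/input.bin";  /* hypothetical file path */

    cuFileDriverOpen();                        /* initialize the GDS driver */

    /* O_DIRECT bypasses the page cache so DMA can target GPU memory. */
    int fd = open(path, O_RDONLY | O_DIRECT);
    if (fd < 0) { perror("open"); return 1; }

    CUfileDescr_t descr = {0};
    descr.handle.fd = fd;
    descr.type = CU_FILE_HANDLE_TYPE_OPAQUE_FD;
    CUfileHandle_t handle;
    cuFileHandleRegister(&handle, &descr);     /* register file with GDS */

    void *devPtr = NULL;
    cudaMalloc(&devPtr, size);
    cuFileBufRegister(devPtr, size, 0);        /* pin GPU buffer for DMA */

    /* Data moves storage -> GPU memory; no bounce buffer in host RAM. */
    ssize_t n = cuFileRead(handle, devPtr, size,
                           /*file_offset=*/0, /*devPtr_offset=*/0);
    printf("read %zd bytes into GPU memory\n", n);

    cuFileBufDeregister(devPtr);
    cudaFree(devPtr);
    cuFileHandleDeregister(handle);
    close(fd);
    cuFileDriverClose();
    return 0;
}
```

On systems without GDS-capable storage, cuFile transparently falls back to a compatibility mode that stages data through host memory, so the same code runs either way.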
Benefits
- Enhanced Performance: Reduces the time spent loading data, improving overall application performance.
- Reduced CPU Load: Offloads data movement tasks from the CPU, freeing up resources for other operations.
- Scalability: Supports large datasets, making it suitable for data-intensive applications.
- Cost Efficiency: Reduces the need for extensive CPU resources, leading to cost savings.