DDN announced the latest addition to its powerful A³I solutions, the DDN AI400X2 Turbo. Delivering 30% more performance than the AI400X2, the previous industry performance leader, the AI400X2 Turbo also offers expanded connectivity options.
As AI workloads multiply across markets, GPU manufacturers continue innovating and developing faster accelerators to handle them. It is critically important that every element of data centre infrastructure can keep pace with that processing power. DDN’s AI400X2 Turbo delivers much better ROI for multi-node GPU clusters, generative AI, inference, and AI frameworks and software libraries, with read speeds of 120 GB/s and write speeds of 75 GB/s per 2U appliance.
“With data centres and cloud providers making massive investments in AI infrastructures, data storage is a key enabler in accelerating ROI, increasing the efficiency of AI frameworks and software libraries, and delivering the highest performance to GPUs,” said Dr James Coomer, senior vice president of products, DDN. “DDN’s AI400X2 Turbo was designed to deliver the highest efficiency, performance and ideal power and simplicity for Gen AI, inference and multi-node GPU clusters, reinforcing DDN’s position as the top choice for large-scale generative AI and large language models.”
The AI400X2 Turbo joins the lineup of A³I appliances deployed today that power NVIDIA DGX systems globally across a wide range of production environments in the financial services, life sciences, healthcare, and autonomous vehicle industries. DDN leads and accelerates safe and power-efficient AI adoption with cutting-edge storage innovation, building solutions that optimise application and AI framework performance.