NVIDIA OVX Validation for Pure Storage AI-ready Infrastructures

Summary

Pure Storage has achieved storage partner validation for NVIDIA OVX servers. The addition of OVX reference architectures to AIRI and FlashStack helps make Pure Storage the go-to provider for enterprise AI-ready infrastructure solutions.

At NVIDIA GTC 2024, Pure Storage announced its close collaboration with NVIDIA to achieve storage partner validation for NVIDIA OVX servers powered by NVIDIA L40S GPUs and FlashBlade//S™. This new reference architecture validation provides enterprises with greater GPU server choice and more immediate availability of proven AI infrastructure for fast and efficient small model training, fine-tuning, and inference workloads.

The addition of Pure Storage’s validated NVIDIA OVX reference architectures to our AIRI® NVIDIA DGX BasePOD validation and new FlashStack® for AI solutions makes Pure Storage the go-to provider for enterprise AI-ready infrastructure solutions.

What is NVIDIA OVX Validation?

NVIDIA OVX is an advanced computing platform engineered to run NVIDIA Omniverse applications; however, the validation also extends to other infrastructure to indicate its performance and compatibility with AI applications. OVX validation indicates that AI infrastructure, such as data storage, is optimized for demanding AI workloads and meets the rigorous standards required by OVX servers for reliability, performance, and compatibility.

Through close collaboration, we designed these solutions specifically to integrate with OVX servers powered by NVIDIA L40S GPUs, giving enterprises an infrastructure choice they can count on for demanding AI workloads such as generative AI, real-time simulation, and 3D workflows.

How OVX Complements Validated Enterprise Infrastructure for GenAI RAG

Pure Storage launched the original AIRI in 2018, and we've since introduced FlashBlade//S (the storage foundation for AIRI) and Cisco Validated Designs with FlashStack for AI.

Our goal: to give customers the best options for full-stack, ready-to-run AI infrastructure.

Demand for new AI workloads like generative AI and retrieval-augmented generation (RAG), which customize large language models (LLMs) for specific domain or company needs, has elevated the need for efficient, powerful AI compute, storage, and networking solutions.
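
To make the RAG pattern concrete, here is a minimal sketch of the flow: embed a query, retrieve the most similar documents from a store, and pass them to an LLM as added context. The embed() and call_llm() functions and the tiny in-memory document list are hypothetical placeholders, not part of any Pure Storage or NVIDIA product; a production pipeline would use a real embedding model, a vector database, and a served LLM.

```python
# Minimal conceptual sketch of retrieval-augmented generation (RAG).
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder embedding: hash characters into a fixed-size vector
    # so the example runs without an external model.
    vec = np.zeros(128)
    for i, ch in enumerate(text.lower()):
        vec[(i + ord(ch)) % 128] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

documents = [
    "FlashBlade//S provides high-throughput file and object storage.",
    "OVX servers are powered by NVIDIA L40S GPUs.",
    "Evergreen//One is a storage-as-a-service subscription.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list:
    # Cosine similarity reduces to a dot product on unit-length vectors.
    scores = doc_vectors @ embed(query)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

def call_llm(prompt: str) -> str:
    # Stand-in for a real model invocation (e.g., a served LLM endpoint).
    return f"[model response to a {len(prompt)}-character prompt]"

def answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    prompt = f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    return call_llm(prompt)

print(answer("Which GPUs power OVX servers?"))
```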

These new AI use cases require high-performance, highly efficient all-flash storage to maximize GPU utilization and AI productivity. Pure Storage's data storage platform for AI delivers the multimodal performance that the mixed payloads of an AI data pipeline demand, with performance- and capacity-optimized DirectFlash® technology. Our products require 80% less energy than alternatives, which frees rack power for more GPUs.

With the Pure Storage Platform for AI, customers are able to:

  1. Accelerate model training and inferencing
  2. Maximize operational efficiency
  3. Deliver cost and energy efficiency at scale
  4. Achieve ultimate reliability and future-proof AI storage

MLOps and Vertical Full-stack Solutions

The difficulty of getting AI initiatives off the ground spans multiple domains. Fundamental challenges include:

  • Legacy compute systems
  • Lack of skilled people
  • Limited budgets
  • Silos of legacy data storage that don’t keep powerful AI-optimized GPUs fully productive

Domain expertise beyond infrastructure and storage is also needed to deploy and operate MLOps applications such as Red Hat OpenShift, Weights & Biases, Run:ai, Ray, and Anyscale, which integrate with NVIDIA AI Enterprise software, including NVIDIA NeMo and the new NVIDIA NIM and NeMo Retriever microservices.
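
As one hedged example of how the application layer talks to this stack: NVIDIA NIM microservices expose an OpenAI-compatible HTTP API, so a locally deployed model can be queried with a standard client. The endpoint URL and model name below are placeholder assumptions that depend on which NIM container is running; this is a sketch, not a supported configuration.

```python
# Sketch: query a locally deployed NVIDIA NIM microservice through its
# OpenAI-compatible API. Assumes a NIM container is already serving a
# model on localhost:8000; the model name is a placeholder.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local NIM endpoint (assumption)
    api_key="not-used",                   # local NIM deployments ignore the key
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",   # placeholder: match your deployed NIM
    messages=[
        {"role": "user", "content": "Summarize retrieval-augmented generation in one sentence."}
    ],
    max_tokens=64,
)

print(response.choices[0].message.content)
```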

Some industries require vertical-specific applications, including financial services applications like KDB.AI, Raft, and Milvus. The healthcare and life sciences stack can include applications from NVIDIA Clara like MONAI for medical imaging.

New validated full-stack solutions from Pure Storage, including MLOps and vertical stacks, have gone through thorough testing, documentation, and integration, fast-tracking IT and AI infrastructure teams so that AI development and data science teams can be up and running faster and with less risk.

Future-proof Storage for AI Uncertainty and Growth

The Pure Storage platform for AI is future-proof. Companies get an AI infrastructure that grows over time without the fear that storage will fall out of date or fail to keep up with the needs of AI and data science teams. Our Evergreen//One™ subscription model makes consuming storage for AI more cloud-like, with all of the advantages of on-premises infrastructure: continuous innovation, financial flexibility, and operational agility.

Learn more about how Pure Storage can help you accelerate adoption of AI as well as the benefits that AI can bring to your business.
