The Flash Storage-AI Connection, Explained

We’ve hit an AI inflection point. How will flash storage play a role in AI’s future? Charlie Giancarlo sits down with The Six Five – On the Road to explain.

Summary

Data storage plays a critical role in AI’s interlocking pipeline of technologies. But not just any storage will do. When it comes to performance and savings on space, power, and cooling, flash storage has disk beat.


After this year’s NVIDIA GTC in San Jose, CA, Pure Storage CEO Charlie Giancarlo sat down with Patrick Moorhead and Daniel Newman of The Six Five – On the Road to talk about artificial intelligence (AI)—its demands on data, its impact on the environment, and why flash storage will be key to its future. 

If you’ve ever wondered, “What does data storage have to do with AI?” read on to find out.

AI will take a village

“It’s a story that’s been building—the story of AI and the future. [AI] takes a village: GPUs, networking, and storage.”

You’d be forgiven for thinking GPUs are all there is to AI, but in reality, AI is a complex pipeline of interlocking technologies: networking, data storage, memory, accelerators, models, tools, and algorithms, to name a few. Not to mention, AI depends on data, which makes storage a critical component.

Over AI’s growth trajectory, Giancarlo noted, the nature of that storage has increasingly come into question. “Seven years ago, everything was going to the cloud,” he said. “Storage, in particular, was headed for white boxes, open source code, fully commoditized. All the major vendors viewed it that way—and stopped investing [in storage].”

However, what Giancarlo—and Pure Storage technology partners like NVIDIA—realized was that you can’t push the performance envelope of networking and compute in the data center without addressing storage at the same time.

“If we believe that AI is going to continue to change everyone’s lives, enterprise [and hyperscale] data centers—whether in the cloud or hybrid—need to see advances in storage, as well.”

AI’s demands on data could be the biggest we’ve ever seen

Why is the highest end of storage so important to AI—and why won’t hard disk systems cut it? It’s a problem of volume, velocity, and performance that legacy systems simply aren’t equipped to solve. 

The more data an AI model has access to, the more it learns. There’s more data in the world than ever before, but that doesn’t mean it’s all accessible.

“A majority of this data is on hard disk,” Giancarlo said, “and hard disk systems just barely have the performance necessary for whatever applications they support. Because of this, if you want to leverage that data for AI, you have to copy it and export it to something more performant.”

If that data resides on flash systems from the start—systems that offer four to five times the performance at a similar price to disk—AI deployments will get a serious leg up.

AI could pose a big energy problem… on disk

At Davos this year, OpenAI CEO Sam Altman warned AI’s future will depend on a real breakthrough in energy efficiency as it continues to consume more and more energy. However, he also shared a potential solution: climate-friendly storage.

It’s another way flash storage blows disk out of the water.

“As you start adding GPUs, you need a lot of power and cooling, and data centers tend to be limited in power and cooling. Data centers aren’t sold by the square foot anymore; they’re sold by the megawatt,” Giancarlo noted. “If you’re pressing up against the limits of your power envelope in your data center… you’re stuck, [or you’re] forced to expand to another data center or bring in more power. You’re talking years of effort and activity and millions of dollars.”

With data center power draw potentially doubling in two years, the math matters: Pure Storage offers four to five times the performance of disk systems at a similar price while requiring one-tenth the space, power, and cooling. The power saved can be redirected to GPUs to maximize AI workflows.
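To make that arithmetic concrete, here is a minimal back-of-envelope sketch. All of the figures in it (a 100 kW disk estate, a 10 kW GPU server) are hypothetical assumptions for illustration only; the one input taken from the interview is the rough one-tenth power figure for flash.

```python
# Back-of-envelope illustration with hypothetical numbers (not Pure Storage
# specs): moving a storage estate from disk to flash at roughly one-tenth the
# power frees budget that can be redirected to GPUs in the same data center.

disk_storage_kw = 100.0   # assumed power draw of a disk-based storage estate
flash_fraction = 0.10     # "one-tenth the space, power, and cooling"
gpu_server_kw = 10.0      # assumed draw of a single GPU server

flash_storage_kw = disk_storage_kw * flash_fraction
freed_kw = disk_storage_kw - flash_storage_kw
extra_gpu_servers = int(freed_kw // gpu_server_kw)

print(f"Flash estate draw: {flash_storage_kw:.0f} kW (was {disk_storage_kw:.0f} kW)")
print(f"Power freed for GPUs: {freed_kw:.0f} kW, roughly {extra_gpu_servers} more GPU servers")
```

With these assumed numbers, the same power envelope absorbs roughly nine additional GPU servers without expanding the facility.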

“Whether it’s power or performance, flash has it all over hard disk.”


RAG from NVIDIA and Pure Storage: “An opportunity to raise the level”

Further down the AI pipeline, projects such as large language models run into other issues: gaps in data availability, accuracy, and variety that can lead to hallucinations, stale information, and irrelevant insights.

As it turns out, storage can be a boon here as well.

At NVIDIA GTC, Pure Storage and NVIDIA demoed a brand-new solution for retrieval-augmented generation in which flash storage plays a critical role.

“In RAG, you want to access a large fraction (or all) of the data in an enterprise. That means you have to be able to access it.”

Giancarlo went on to explain why this can be so hard to do without the power of Pure Storage.

On disk systems, “The data isn’t networked. It’s largely hidden behind applications. It’s not a first-class citizen. To copy it, you have to do it through the application,” such as an ERP platform. The performance RAG needs is missing, too: “[The data] is just performant enough for the application it’s behind,” but not nearly enough to feed generative AI applications.

On Pure Storage, the data is networked. “Our systems all operate on the same operating environment: Purity. We can leverage Purity via Fusion to ‘network’ the data storage,” allowing a single pool of data to be accessible to AI applications, even when that data is also supporting primary applications—ranging all the way from AI to archive. 
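For readers newer to the pattern, here is a minimal sketch of how retrieval-augmented generation works at the application level. It is illustrative only: the embed, index, and llm callables are hypothetical stand-ins, not Pure Storage, Purity, Fusion, or NVIDIA APIs. The point is simply that RAG first retrieves relevant enterprise data and then hands it to the model alongside the question, which is why that data has to be both accessible and fast to read.

```python
# Minimal retrieval-augmented generation (RAG) sketch. The embed(), index, and
# llm callables are hypothetical stand-ins for illustration, not vendor APIs.

def retrieve(question: str, index, embed, top_k: int = 5) -> list[str]:
    """Return the enterprise documents most relevant to the question."""
    query_vector = embed(question)            # turn the question into a vector
    return index.search(query_vector, top_k)  # nearest-neighbor lookup

def answer(question: str, index, embed, llm) -> str:
    """Augment the prompt with retrieved context, then generate."""
    context = "\n".join(retrieve(question, index, embed))
    prompt = (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
    return llm(prompt)
```

Every call to retrieve() is a read against the storage holding that single pool of enterprise data, so retrieval quality and latency are bounded by how accessible and performant that storage is.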

Watch the video for the full interview and to discover how Pure Storage is helping customers accelerate their adoption of AI with a data storage platform built on one fully integrated, consistent operating environment for every stage of the AI pipeline.
