Pure Storage & NVIDIA unveil AI-ready validated reference architectures
Tue, 19th Mar 2024

Pure Storage has unveiled new reference architectures based on the NVIDIA OVX platform for generative Artificial Intelligence (AI). The launch, made in partnership with NVIDIA, gives global customers a validated framework for handling high-performance data, helping them deploy AI successfully.

Building on their ongoing partnership, Pure Storage aims to address rapidly growing demand for AI through these new validated designs and proofs of concept. Among the innovations introduced is a Retrieval-Augmented Generation (RAG) pipeline for AI inference, designed to improve the accuracy, currency, and relevance of inference for large language models (LLMs). The pipeline combines NVIDIA NeMo Retriever microservices and NVIDIA GPUs with Pure Storage all-flash enterprise storage. As a result, businesses can quickly gain insights from their own internal data, dispensing with the need for frequent retraining of LLMs.
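To illustrate the pattern, the sketch below shows a minimal retrieval-augmented generation loop in Python: relevant internal documents are retrieved and prepended to the prompt so the model can draw on current enterprise data without being retrained. The `embed`, `retrieve` and `generate` functions are simplified, hypothetical stand-ins and do not represent NVIDIA NeMo Retriever, Pure Storage, or any vendor API.

```python
# Minimal, hypothetical RAG sketch: retrieve relevant internal documents,
# then prepend them to the prompt so the LLM can answer without retraining.
# embed() and generate() are toy placeholders, not real NVIDIA or Pure Storage APIs.
import math
from typing import List

DOCUMENTS = [
    "Q3 revenue grew 12% year over year, driven by subscription services.",
    "The storage array refresh is scheduled for fiscal year 2025.",
    "Headcount in the data platform team increased by 15 engineers.",
]

def embed(text: str) -> List[float]:
    """Toy embedding: normalised character-frequency vector (placeholder for a real model)."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def retrieve(query: str, k: int = 2) -> List[str]:
    """Rank stored documents by cosine similarity to the query embedding."""
    q = embed(query)
    scored = [(sum(a * b for a, b in zip(q, embed(d))), d) for d in DOCUMENTS]
    return [d for _, d in sorted(scored, reverse=True)[:k]]

def generate(prompt: str) -> str:
    """Placeholder for an LLM inference call; here it simply echoes the prompt."""
    return f"[model answer based on]\n{prompt}"

def rag_answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    return generate(prompt)

if __name__ == "__main__":
    print(rag_answer("How did revenue change in Q3?"))
```

In a production deployment, the in-memory list would be replaced by documents held on enterprise storage and the placeholder functions by retrieval and inference services; the control flow, however, stays the same.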

The vendor's new NVIDIA OVX Server Storage Reference Architecture provides its enterprise customers and channel partners with flexible storage reference architectures, validated against key benchmarks to give a solid infrastructure foundation for cost- and performance-optimised AI hardware and software solutions. The validation broadens the options available to AI customers and complements Pure Storage's certification for NVIDIA DGX BasePOD, announced last year.

Pure Storage is also developing vertical RAG solutions with NVIDIA to speed up the adoption of AI across industries. The first of these is a financial services RAG solution that uses AI to produce instant summaries and analyses, enabling faster insights from financial documents and other sources. The company plans to release additional RAGs for the healthcare and public sectors.

Pure Storage is also deepening its investment in its AI partner ecosystem with NVIDIA. It has formed new partnerships with independent software vendors such as Run.AI and Weights & Biases: Run.AI optimises GPU utilisation through advanced orchestration and scheduling, while the Weights & Biases AI development platform enables ML teams to oversee the model development lifecycle. Pure Storage is also working with AI-focused resellers and service partners such as ePlus, Insight and WWT to further streamline joint AI deployments for customers.
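As a rough idea of the kind of experiment tracking the Weights & Biases platform provides, the short sketch below logs metrics from a stubbed training loop using the wandb client library. The project name, config values and fake loss curve are purely illustrative and are not part of any Pure Storage or NVIDIA reference architecture.

```python
# Illustrative experiment-tracking sketch with the Weights & Biases client library.
# The project name, config and "training loop" are stand-ins for a real ML workflow.
import random
import wandb

def train_stub(epochs: int, lr: float) -> None:
    run = wandb.init(project="rag-demo", config={"epochs": epochs, "lr": lr})
    loss = 1.0
    for epoch in range(epochs):
        loss *= random.uniform(0.85, 0.95)          # fake improvement per epoch
        wandb.log({"epoch": epoch, "loss": loss})   # record metrics for this run
    run.finish()

if __name__ == "__main__":
    train_stub(epochs=5, lr=3e-4)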

"Pure Storage identified the escalating demand for AI early on and delivers an effective, trustworthy, and high-performing platform for the most advanced AI deployments," said Rob Lee, Chief Technology Officer of Pure Storage. "Our collaboration with NVIDIA has led to the latest validated AI reference architectures and generative AI proofs of concept, which are key elements for global enterprises in tackling the complexities of the AI puzzle," he added.

Mike Leone, Principal Analyst at ESG, acknowledged the benefit of Pure Storage's NVIDIA-validated reference architectures and proofs of concept, stating, "enterprises across various fields have a shortcut to AI success. Rather than spending precious resources on creating an AI architecture from the get-go, Pure's tried-and-true frameworks not only prevent the risk of expensive project delays but also guarantee a high return on investment for AI team expenditures like GPUs."