
Intel's alternative AI solutions cater to surging demand amidst complexity growth

Fri, 8th Dec 2023

Intel continues to deliver choice with its open artificial intelligence (AI) solutions while keeping generative AI (GenAI) compute costs manageable. As AI workloads grow more complex, companies are looking for new ways to meet the rising demand for AI compute.

Speed, scale, sustainability and security are critical to getting GenAI solutions into production, according to Intel's observations from the field. At the Intel Innovation event in September, Intel announced that Stability AI, creator of the Stable Diffusion text-to-image model, would be the anchor customer for a large AI supercomputer built entirely on Intel technology, including Intel Gaudi2 AI hardware accelerators and Intel Xeon processors.

Stability AI, like many other companies, is seeking alternative AI compute solutions as workloads grow in complexity. Intel's track record of collaborating with the open ecosystem and independent software vendors to scale technologies gives developers more flexibility and compatibility. Managing compute costs without compromising performance is a key reason companies, including Stability AI, are opting for Intel's proven alternatives.

Research suggests that many companies grapple with the challenges of deploying GenAI, a type of AI that "generates" new content from queries of existing data, in production. According to cnvrg.io, about 10% of firms currently experimenting with GenAI are pushing these workflows into production. Many organisations, however, struggle to scale their solutions beyond the pilot or proof-of-concept phase and make them operational in a production environment where they can deliver maximum business benefit.

Transitioning from proof of concept to production requires the four S's: speed, scale, sustainable cost and security. Balancing all four can be challenging and sometimes impossible. Some organisations turn to inexpensive but hard-to-scale APIs for speed, while others attempt to build their own large language models (LLMs) in pursuit of scale, a task demanding significant time, effort and expertise. Recent partnership options within the AI community, however, let firms customise open solutions to their needs, speeding up GenAI projects considerably and potentially sparing them from building systems from the ground up.

On the cost front, "pay as you go" pricing models from some providers lure many organisations. These projects start out inexpensive but soon become costly: when a company moves from proof of concept to a production environment, the bill can quickly climb into the millions of dollars. A yearly contract for an enterprise licence with another provider may seem expensive at first, but that price does not change when moving from pilot to production, which can lead to substantial savings over time.

Before proceeding to production, a crucial step for businesses is to understand how their data is secured and handled, especially when working with a partner, and to ensure compliance with commitments made to customers and with any governing regulations. Arun Subramaniyan, leader of the Data Center & AI Cloud Execution and Strategy team at Intel Corporation, likens the current AI momentum to the advent of the internet in 1996: organisations have begun tapping AI's potential en masse, albeit without a clear understanding of its long-term implications and opportunities.

Subramaniyan added, "We are working to bring AI everywhere – to new use cases, industries, devices, people and more." On December 14, Intel plans to showcase more customer successes with GenAI products in production at its AI Everywhere launch event. Businesses that can balance speed, scale, sustainable cost and security without giving up any of them are poised to be at the forefront of the generative AI revolution.
