
Artificial Intelligence driving data centres to the edge

In the past year, artificial intelligence (AI) has surged forward like a digital renaissance, echoing the rapid and transformative rise of the Internet in the late 1990s. It has revolutionised industries and redefined our daily lives with extraordinary speed – and its impact is set to grow even more substantially in the coming years. Investments in generative AI reached US$25.2 billion in 2023, nearly nine times the amount invested in 2022 and approximately 20 times the funding seen in 2019.

This rapid growth presents data centre companies with opportunities to innovate, expand their service offerings, and cater to the evolving needs of AI-driven applications and enterprises. By embracing AI technologies and adapting their infrastructure and operations accordingly, data centres play a crucial role in enabling the broader adoption and success of AI across various sectors.

However, the integration of AI comes with its own set of challenges. AI workloads currently draw an estimated 4.3 GW of data centre power, a figure projected to reach as much as 18 GW by 2028. That surge outpaces the overall growth of data centre power demand, presenting capacity and sustainability challenges. AI requires data centres not just to expand but to fundamentally transform their architecture, including specialised IT infrastructure, power, and cooling systems.
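
To put that projection in perspective, here is a quick back-of-the-envelope calculation using the figures above (treating 4.3 GW as the 2023 baseline and the 18 GW figure as landing in 2028 is an assumption for illustration):

```python
# Rough growth check using the figures quoted above (illustrative only).
current_gw = 4.3     # estimated AI data centre power demand today (GW)
projected_gw = 18.0  # projected AI power demand by 2028 (GW)
years = 5            # assumed horizon, roughly 2023 to 2028

cagr = (projected_gw / current_gw) ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.0%}")  # roughly 33% per year
```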

Powering sustainable AI data centres
AI workloads are expected to grow two to three times faster than legacy data centre workloads, representing 15 to 20% of all data centre capacity by 2028. More workloads will also start moving closer to users at the edge to reduce latency and enhance performance.

Training large language models often requires thousands of graphics processing units (GPUs) working in unison. Large AI clusters can range from 1 MW to 2 MW, with rack densities of 25 kW to 120 kW depending on the GPU model and quantity. These characteristics drive rack power density sharply upward, presenting substantial infrastructure challenges for data centres: most facilities today can only support rack power densities of about 10 to 20 kW.
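
As a simple illustration of the gap (the 1 MW cluster size is a hypothetical example; the densities are the ranges quoted above):

```python
# Illustrative rack-count comparison for a hypothetical 1 MW AI cluster.
cluster_kw = 1000  # 1 MW cluster expressed in kW

for density_kw in (10, 20, 50, 120):  # legacy densities vs. high-density AI racks
    racks = cluster_kw / density_kw
    print(f"{density_kw:>3} kW/rack -> about {racks:.0f} racks")
```

At legacy densities the same cluster would sprawl across 50 to 100 racks, while at the upper end of AI rack densities it fits into fewer than ten, which is why per-rack power and cooling become the binding constraints.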

Data centres must adapt to meet the evolving power needs of AI-driven applications effectively and sustainably, so optimising physical infrastructure for AI requirements is crucial. Transitioning from low-density to high-density configurations can help address these challenges. Collaboration with technology providers such as NVIDIA can accelerate this shift: the most recent joint executive brief emphasises the critical role of reference designs in expediting the deployment of high-density AI clusters in data centres, enabling advancements in edge AI and digital twin technologies. Retrofit reference designs for adding AI clusters to existing facilities, and new-build designs tailored for accelerated computing clusters, can support various applications, including data processing, engineering simulation, electronic design automation, and generative AI.

By addressing the evolving demands of AI workloads, these reference designs provide a robust framework for integrating NVIDIA's accelerated computing platform into data centres, enhancing performance, scalability, and sustainability.

Keeping AI data centres cool
AI data centres generate substantial heat, necessitating liquid cooling to ensure optimal performance, sustainability, and reliability. After the IT infrastructure itself, cooling systems are the second-largest energy consumer in a data centre. In less densely utilised traditional data centres and distributed IT locations, cooling can account for 20 to 40% of the facility's total energy consumption.
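
As a minimal sketch of what that share implies for efficiency (the 20 to 40% range comes from the figures above; the 1,000 kW IT load and the assumption that everything other than cooling is IT load are simplifications for illustration):

```python
# Illustrative cooling-overhead sketch for a hypothetical 1,000 kW IT load.
it_load_kw = 1000.0

for cooling_share in (0.20, 0.40):
    # Assume cooling is this fraction of total facility power and the rest is IT.
    total_kw = it_load_kw / (1 - cooling_share)
    cooling_kw = total_kw - it_load_kw
    print(f"cooling at {cooling_share:.0%} of facility power: "
          f"~{cooling_kw:.0f} kW of cooling per {it_load_kw:.0f} kW of IT "
          f"(PUE ~ {total_kw / it_load_kw:.2f})")
```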

Liquid cooling offers many benefits, including higher energy efficiency, smaller footprint, lower total cost of ownership (TCO), enhanced server reliability, and lower noise levels.
As the demand for AI processing power grows and thermal loads increase, liquid cooling becomes a critical element in data centre design. Liquid cooling can be deployed to cover a range of needs, from white space solutions to heat rejection strategies. Resources such as white papers on liquid cooling architectures can help data centre companies navigate the intricacies of system design, implementation, and operational considerations.

AI and data centre evolution for a sustainable future
AI has the potential to optimise energy usage, yet it also raises concerns about increased energy consumption. Accelerated computing, which drives the AI revolution, can enable data centres to achieve more with fewer infrastructure resources.

However, it's crucial to evaluate AI's broader impact on energy consumption and the environment. Gartner predicts that, by 2027, 80% of CIOs will have performance metrics tied to the sustainability of the IT organisation.

According to the 2024 Sustainability Index, nearly one in ten business decision-makers in Australia are already using AI as a resource for their decarbonisation transformation. Combining AI with real-time monitoring can turn operational data into actionable insights for improved sustainability. Studies indicate that advanced energy management capabilities can deliver significant savings on utility expenses by optimising power usage and cooling efficiency.
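
As a hedged sketch of what turning real-time monitoring into an actionable insight can look like in practice (the rolling-PUE metric, the 1.5 alert threshold, and the function names here are illustrative assumptions, not any specific vendor's method):

```python
# Minimal sketch: turn raw facility meter readings into a sustainability signal.
from collections import deque

WINDOW = 12        # e.g. the last 12 samples of a five-minute meter feed
PUE_ALERT = 1.5    # hypothetical efficiency threshold

facility_kw: deque = deque(maxlen=WINDOW)
it_kw: deque = deque(maxlen=WINDOW)

def ingest(sample_facility_kw: float, sample_it_kw: float) -> None:
    """Record one pair of meter readings and flag sustained inefficiency."""
    facility_kw.append(sample_facility_kw)
    it_kw.append(sample_it_kw)
    if len(it_kw) == WINDOW:
        rolling_pue = sum(facility_kw) / sum(it_kw)
        if rolling_pue > PUE_ALERT:
            print(f"Rolling PUE {rolling_pue:.2f} exceeds {PUE_ALERT}: "
                  "review cooling setpoints and stranded capacity")

# Example feed: constant readings for illustration; the alert fires once the
# window fills (1,600 kW facility draw against 1,000 kW of IT load, PUE 1.60).
for _ in range(WINDOW):
    ingest(1600.0, 1000.0)
```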

Data centres operate with significant energy demands, posing challenges to environmental sustainability. Optimising energy efficiency, lowering carbon emissions, and enhancing operational resilience are essential to enable data centres to operate responsibly and foster a more sustainable future.

The demand for AI and the evolution of the data centre are interconnected forces shaping the digital landscape. Growing workloads, especially deep learning models, need significant computing resources to train, which in turn calls for data centres that can support the performance requirements of AI.

As AI technology advances, it will continue to influence the design and operation of data centres. While these advancements bring efficiency and innovation, they also pose challenges related to energy consumption and power and cooling systems.

The relentless advancement of AI is only going to continue, and to meet these evolving demands, the data centre industry must adapt.
