Is your data centre a ‘carbonivore’?
Mon, 13th Nov 2023

Sustainability has become a boardroom priority for modern organisations. Alongside the pressure of the ongoing climate crisis, there are good business reasons for this to be a focus.

For instance, 68% of prospective employees are more likely to apply for and accept positions from environmentally sustainable companies, while businesses making ESG-related claims see higher growth than their competitors.

However, when sustainability goals are paired with ambitious digital transformation strategies, IT and security leaders are left with the challenging brief of building a secure, innovative IT infrastructure while limiting carbon emissions.

The first step in resolving this dilemma is to address the worst culprit: the data centre. Businesses generate data at a near-exponential rate, and data centres facilitate the storage and nearly endless flow of this data between servers, tools, virtual machines, and applications, consuming vast amounts of energy in the process.

In fact, data centres account for around two percent of all global carbon emissions each year, a carbon footprint larger than that of the airline industry.

These ‘carbonivores’ not only contribute huge amounts of carbon emissions to an organisation’s overall footprint, but the electricity needed to run them also leads to hidden costs and operational challenges.

Microsoft has made a global commitment to match 100% of its electricity consumption, 100% of the time, with carbon-free energy purchases by 2030. It will also take other steps to reduce its emissions.

The company says it regularly measures the energy efficiency of its data centres around the world using the power usage effectiveness (PUE) metric. This is calculated by dividing a data centre’s total power consumption by the amount of power used to run the IT equipment. A lower PUE score indicates a more energy-efficient data centre, with a PUE of 1.0 being the best possible score. Microsoft’s new data centre in New Zealand is expected to have an average PUE of 1.12, which is in line with its new Australian data centres.
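As a rough illustration (the wattage figures below are hypothetical, not Microsoft’s), the calculation can be sketched in a few lines of Python:

    # Power usage effectiveness: total facility power divided by the power
    # used by the IT equipment alone. 1.0 is the ideal score.
    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        return total_facility_kw / it_equipment_kw

    # Hypothetical facility drawing 1,120 kW in total to run 1,000 kW of IT load:
    print(pue(1120, 1000))  # 1.12 - the score expected of the new data centres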

Energy consumption is not a new challenge for the data centre industry. Many system and silicon vendors have introduced more energy-efficient hardware in recent years, offering servers and storage devices with lower power consumption and features such as variable-speed fans.

Data centre cooling has also been a source of innovation, optimised through tactics such as free cooling, hot aisle/cold aisle containment, and even constructing new hubs in colder climates for natural cooling effects. However, businesses themselves can limit the energy consumption of their data centres by reconsidering their network management strategies altogether.

All data centres use various security and monitoring tools to capture data communications through network traffic, each with hidden costs and carbon outputs of their own.

One popular network analytics probe used across service providers and enterprises requires up to 586 W of power to process 16Gbps of network traffic. Monitoring 100Gbps of traffic would, therefore, require seven individual probes, consuming 35,934 kWh in a single year, roughly the equivalent of 100 home refrigerators.
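That estimate is simple arithmetic; a quick sketch, using only the figures quoted above:

    import math

    PROBE_WATTS = 586             # maximum draw of one probe
    PROBE_CAPACITY_GBPS = 16      # traffic one probe can process
    TRAFFIC_GBPS = 100            # traffic to be monitored
    HOURS_PER_YEAR = 24 * 365     # 8,760 hours

    probes = math.ceil(TRAFFIC_GBPS / PROBE_CAPACITY_GBPS)  # 7 probes
    annual_kwh = probes * PROBE_WATTS * HOURS_PER_YEAR / 1000
    print(probes, round(annual_kwh))  # 7 probes, 35934 kWh a year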

Some of this traffic processing is unnecessary. Streamlining this patchwork of separate tools, by determining which traffic is processed by which tool and reducing data duplication, allows data centres to run far more efficiently, cutting energy costs and reducing carbon output.

Gaining true visibility into the network can guide businesses in making the right decisions to streamline their data flows. There are four key tactics to optimise tool usage and eradicate redundant or replicated data packets. When combined, they go a long way towards reducing irrelevant network traffic, in turn cutting energy consumption and building a more sustainable infrastructure.

1. Deduplication:
The structure of modern data centre networks prioritises resiliency and availability. However, this approach creates duplicate network packets across a network, dramatically increasing the volume of traffic being processed by data centre tools.

Each network packet only needs to be analysed once, and deduplication allows operators to identify and remove duplicates before sending network data to tools, reducing redundancies and thereby requiring far less energy.
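A minimal sketch of the principle in Python (production deduplication engines match on selected header fields within a time window; this illustration simply hashes whole packets):

    import hashlib

    def deduplicate(packets):
        # Forward each unique packet once and drop the exact copies created
        # by taps and mirrored links elsewhere in the network.
        seen = set()
        for packet in packets:
            digest = hashlib.sha256(packet).digest()
            if digest not in seen:
                seen.add(digest)
                yield packet

    stream = [b"flow-a", b"flow-b", b"flow-a"]   # third packet is a duplicate
    print(len(list(deduplicate(stream))))        # 2 packets reach the tools, not 3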

2. Application filtering:
Application filtering separates data based on traffic signature, distinguishing between high- and low-risk applications, even when data is encrypted. High-volume, trusted applications such as YouTube can be filtered out, allowing businesses to focus their data centre tools where they are needed most.
This reduces the amount of data flowing across the network and limits the energy use of data centre tools.
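A hypothetical sketch of the principle (the application names and classification step are invented for illustration):

    # High-volume, trusted applications are dropped before traffic reaches
    # security and monitoring tools. Names here are illustrative only.
    TRUSTED_HIGH_VOLUME = {"youtube", "netflix", "software-updates"}

    def forward_to_tools(flow):
        # flow["app"] would come from a traffic-signature classifier, which
        # can identify many applications even when payloads are encrypted.
        return flow["app"] not in TRUSTED_HIGH_VOLUME

    flows = [{"app": "youtube"}, {"app": "unknown"}]
    print([f["app"] for f in flows if forward_to_tools(f)])  # ['unknown']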

3. Flow mapping:
Flow mapping, the process of sending only the network data relevant to each tool’s needs, drastically reduces network traffic and prevents tools from becoming overloaded with information from unnecessary subnets, protocols, or VLANs.
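As a toy illustration (the tool names and match rules below are invented), a flow map is essentially a set of per-tool rules:

    # Each tool receives only the traffic it actually needs to see.
    FLOW_MAP = {
        "ids":          lambda pkt: pkt["vlan"] in {10, 20},       # user VLANs only
        "voip_monitor": lambda pkt: pkt["protocol"] == "RTP",      # voice traffic only
        "web_analyser": lambda pkt: pkt["dst_port"] in {80, 443},  # web traffic only
    }

    def tools_for(packet):
        return [tool for tool, rule in FLOW_MAP.items() if rule(packet)]

    print(tools_for({"vlan": 10, "protocol": "TCP", "dst_port": 443}))  # ['ids', 'web_analyser']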

4. Flow slicing:
This method of optimisation focuses on reducing the information shared via network packets in every user session. Much like flow mapping, flow slicing functions on the basis that inundating tools with non-essential information wastes valuable energy, and many tools only need to see the initial setup, header, and handshake information.

Flow slicing is highly efficient and can have a significant impact on tool traffic and, ultimately, energy consumption: real-world deployments reduce tool traffic by 80 to 95%.
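Conceptually, flow slicing truncates each session after the packets most tools care about; a simplified sketch (the five-packet cutoff is an arbitrary illustration):

    def slice_flow(session_packets, keep_first=5):
        # Forward only the start of the session - setup, headers and
        # handshake - and drop the bulk payload most tools never inspect.
        return session_packets[:keep_first]

    session = [f"pkt-{i}" for i in range(100)]  # a 100-packet session
    print(len(slice_flow(session)))             # 5 forwarded, 95 dropped (a 95% reduction)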

So, how do we achieve true efficiency with deep observability? Efficiency in spending, person-hours, and energy usage has been the name of the game for IT and security leaders, who are constantly tasked with doing more with less. Many cloud journeys began, or were accelerated, in response to the COVID-19 pandemic.

As a result, inefficiency in infrastructure and spending is something that all modern businesses are now contending with, and a focus on carbon emissions is just the latest pressure for businesses to reduce these redundancies.

These tactics do more than reduce the energy requirements of data centres. By streamlining data tooling, they can enable organisations to remove unnecessary data centre tools and reduce sprawl, saving valuable IT spending for more innovation.

The old sustainability mantra, ‘reduce, reuse, recycle’, reminds us that the first and best way to save resources is to not use them in the first place. Deep observability and better data management empower enterprises to do just this.

By applying a more strategic, considered approach, decision-makers can benefit their businesses’ budgets, network uptime, and even the planet.