Thu, 9th Nov 2023

Efficiency lies at the heart of business success. From the finance department to human resources, optimised workflows are one of the most effective ways for organisations to stand out to customers and meet business goals.

With digitalisation growing rapidly, every modern organisation is underpinned by its IT networks. Efficient networks don't just ensure uptime and a great user experience; they can also drastically reduce IT spending.

Energy-hungry data centres are costly and have a sizeable impact on an organisation’s ability to meet ESG targets, so there is no shortage of reasons to bring an efficiency mindset to handling network traffic.

But how can IT and security leaders meet the challenging brief of building a secure, innovative IT infrastructure that runs efficiently, eliminates unnecessary energy consumption, and reduces the carbon footprint for the organisation?

Between increasing cloud adoption, remote working practices, and the immense amount of business data created each day, business traffic is at an all-time high. The increased popularity of data-intensive activities such as streaming, gaming, and virtual or augmented reality is driving network demand even higher, placing communication service providers under further strain.

These high-traffic consumers are great for business, but they come at a cost, and not just a financial one.

Data centres are immensely expensive in terms of physical space and energy. In recent years, the data centre industry has worked to introduce more energy-efficient hardware, including low-power servers and storage devices, and improved cooling features.

From free cooling to hot aisle/cold aisle containment to moving data centre hubs themselves to colder climates, organisations have tried to reduce consumption and improve the efficiency of these much-needed machines. But the simpler solutions lie in the traffic itself.

Data centres use a network of security and monitoring tools to secure and capture insights from network traffic. Each one of these tools brings hidden costs and energy inputs, and these can tally up fast.

Take a typical network analytics probe used by enterprises and service providers. To process 16Gbps of network traffic, the probe would consume up to 586 W of power; at that rate, monitoring 100Gbps of traffic would require seven individual probes.

Over just one year, these probes would consume the equivalent of roughly 100 domestic refrigerators, or just under 36,000 kWh.
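As a rough back-of-the-envelope check, the arithmetic works out. The sketch below takes the per-probe wattage and probe count from the example above; the roughly 350 kWh per year for a domestic refrigerator is an assumed comparison figure.

```python
# Back-of-the-envelope check for the probe example above.
# 586 W per probe and seven probes come from the text; ~350 kWh/year
# for a domestic refrigerator is an assumed comparison figure.

PROBE_POWER_W = 586        # power draw per analytics probe
PROBE_COUNT = 7            # probes needed to monitor 100Gbps
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

annual_kwh = PROBE_POWER_W * PROBE_COUNT * HOURS_PER_YEAR / 1000
print(f"Annual consumption: {annual_kwh:,.0f} kWh")         # ~35,934 kWh
print(f"Refrigerator equivalents: {annual_kwh / 350:.0f}")  # ~103
```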

Some of this traffic processing is not necessary. The IT industry is a hub for innovation, but it often struggles to reassess what has become a norm. Reconsidering network management strategies altogether and implementing a few smart tactics can go a long way towards reducing the space, energy, expensive tooling, and valuable time eaten up by data centres each day.

Streamlining this patchwork of separate tools, by determining which traffic each tool actually needs to process and cutting data duplication, lets data centres run at much greater data efficiency, reducing both energy costs and carbon output.

Gaining true visibility into the network can guide businesses in making the right decisions to streamline their data flows.

There are four key tactics for optimising tool usage and eradicating redundant or replicated data packets: deduplication, application filtering, flow mapping and flow slicing. When combined, they go a long way towards reducing irrelevant network traffic, in turn cutting energy consumption and building a more sustainable infrastructure.

The structure of modern data centre networks prioritises resiliency and availability, but this approach creates duplicate network packets across a network, dramatically increasing the volume of traffic being processed by data centre tools.

Each network packet only needs to be analysed once, and deduplication allows operators to identify and remove duplicates before sending network data to tools, reducing redundancies and thereby requiring far less energy.
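A minimal sketch of the deduplication logic, assuming packets arrive as raw byte strings: fingerprint each packet and drop any copy seen within a short window. Production deduplication runs at line rate in dedicated hardware; this only illustrates the idea, and the window length is an assumed value.

```python
import hashlib

# Packet-deduplication sketch: remember a fingerprint of each packet
# for a short window and drop any repeat seen within it.
WINDOW_SECONDS = 0.05   # how long a duplicate can trail the original (assumed)
seen = {}               # fingerprint -> timestamp first seen

def is_duplicate(packet: bytes, now: float) -> bool:
    # A production system would hash only the fields that stay identical
    # across both copies (e.g. ignoring TTL); the full payload is
    # fingerprinted here for simplicity.
    digest = hashlib.sha256(packet).digest()
    # Evict fingerprints older than the window (a linear scan is fine
    # for a sketch; real appliances do this in hardware).
    for stale in [k for k, t in seen.items() if now - t > WINDOW_SECONDS]:
        del seen[stale]
    if digest in seen:
        return True     # duplicate: do not forward to the tools
    seen[digest] = now
    return False
```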

Application filtering separates data based on traffic signature, distinguishing between high- and low-risk applications, even when data is encrypted. High-volume, trusted applications such as YouTube can be filtered out, allowing businesses to focus data centre tools where they are needed.

This reduces the amount of data flowing across the network and limits the energy use of data centre tools.
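As a simplified illustration, assuming flows have already been classified to a server name: real application filtering uses richer traffic signatures, and the trusted-application entries beyond YouTube are assumed examples.

```python
# Application-filtering sketch: drop traffic from trusted, high-volume
# applications before it reaches the tool stack. Matching on a server
# name is a simplification of signature-based classification, and the
# entries beyond YouTube are assumed examples.

TRUSTED_HIGH_VOLUME = ("youtube.com", "netflix.com", "windowsupdate.com")

def should_forward_to_tools(server_name: str) -> bool:
    """Return False for trusted high-volume apps the tools can skip."""
    return not server_name.endswith(TRUSTED_HIGH_VOLUME)

print(should_forward_to_tools("www.youtube.com"))  # False: filtered out
print(should_forward_to_tools("intranet.corp"))    # True: keep inspecting
```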

Flow mapping is the process of sending each tool only the network data relevant to its needs. It drastically reduces network traffic and prevents tools from becoming overloaded with information from unnecessary subnets, protocols or VLANs.
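A toy sketch of the mapping idea, where each rule forwards a flow only to the tools that need it; the tool names, VLANs and rules here are illustrative assumptions.

```python
# Flow-mapping sketch: each rule sends a flow only to the tool that
# needs it. Tool names, VLANs and protocols are illustrative.

FLOW_MAP = [
    # (predicate on the flow, destination tool)
    (lambda f: f["protocol"] == "dns",            "dns_security_tool"),
    (lambda f: f["vlan"] in (10, 20),             "performance_monitor"),
    (lambda f: f["protocol"] in ("http", "tls"),  "web_analytics_probe"),
]

def tools_for(flow: dict) -> list:
    """Return only the tools whose rules match this flow."""
    return [tool for predicate, tool in FLOW_MAP if predicate(flow)]

# A voice flow on VLAN 30 matches no rule, so no tool wastes cycles on it.
print(tools_for({"vlan": 30, "protocol": "rtp"}))   # []
print(tools_for({"vlan": 10, "protocol": "tls"}))   # two matching tools
```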

Flow slicing is an optimisation method focused on reducing the information shared via network packets in every user session. Much like flow mapping, flow slicing works on the principle that inundating tools with non-essential information wastes valuable energy; many tools only need to see the initial setup, header and handshake information.

Flow slicing is highly efficient and can have a significant impact on tool traffic and energy consumption: real-world deployments reduce tool traffic by 80 to 95 percent.
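A minimal sketch of the principle, assuming flows can be tracked by a per-session key; the eight-packet budget is an assumed illustrative value.

```python
from collections import defaultdict

# Flow-slicing sketch: forward only the first few packets of each
# session (setup, headers, handshake) and drop the payload-heavy rest.
# The eight-packet budget is an assumed illustrative value.

SLICE_BUDGET = 8
forwarded = defaultdict(int)   # flow key -> packets forwarded so far

def forward_packet(flow_key: tuple) -> bool:
    """Return True while the flow is still within its slice budget."""
    forwarded[flow_key] += 1
    return forwarded[flow_key] <= SLICE_BUDGET
```

Because most of a session's volume sits in the payload packets after the handshake, dropping everything past the budget is what yields the large reductions described above.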

For tools that consume data sets rather than raw network packets, one of the most powerful ways to reduce the volume of unnecessary or irrelevant data is to intelligently define and select specific metadata elements from the traffic for specific applications.

For some applications, this can reduce the data sent by as much as 95%, giving the tool only the traffic attributes it needs.
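A sketch of the idea, projecting each flow's attributes down to only what a given tool has subscribed to; the attribute names and the subscription below are illustrative assumptions.

```python
# Metadata-selection sketch: instead of forwarding full packets, emit a
# small record of just the attributes a tool has subscribed to.
# Attribute names and the subscription below are assumptions.

TOOL_SUBSCRIPTIONS = {
    "threat_detection": {"src_ip", "dst_ip", "dns_query", "tls_sni"},
}

def metadata_record(tool: str, flow_attributes: dict) -> dict:
    """Project the full flow record down to the tool's subscription."""
    wanted = TOOL_SUBSCRIPTIONS[tool]
    return {k: v for k, v in flow_attributes.items() if k in wanted}

full = {"src_ip": "10.0.0.5", "dst_ip": "93.184.216.34", "bytes": 48210,
        "packets": 512, "duration_ms": 1840, "dns_query": None,
        "tls_sni": "example.com"}
print(metadata_record("threat_detection", full))
# Only four of seven attributes are sent, sketching the kind of
# reduction described above.
```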

Efficiency in spending, staff hours and energy usage has been the name of the game for IT and security leaders, who are constantly tasked with doing more with less. Many cloud journeys began, or were accelerated, in response to the COVID-19 pandemic.

As a result, inefficiency in infrastructure and spending is something that all modern businesses are now contending with, and a focus on carbon emissions is just the latest pressure for businesses to reduce these redundancies.

These tactics do more than reduce the energy requirements of data centres. By streamlining data tooling, they can enable organisations to remove unnecessary data centre tools and reduce sprawl, saving valuable IT spending for more innovation.

Certain tools can see a 95% reduction in traffic through the combination of filtering, deduplication and application metadata. Just as importantly, the remaining 5% of information the tools receive is every bit as high in fidelity and relevance; the tool simply no longer wastes resources sifting through the full 100% to find the information of value.

Other tools that need to see nearly all traffic may achieve only a 25% reduction through modest use of filtering. In all scenarios, organisations can better manage and measure power usage across the tooling infrastructure.

The old sustainability mantra, ‘reduce, reuse, recycle,’ reminds us that the first and best way to save resources is to not use them in the first place. Deep observability and better data management empower enterprises to do just this. By applying a more strategic, considered approach, decision-makers can benefit their businesses’ budgets, network uptime, and even the planet.