
Focus on quality over quantity with server deployment, says exec

02 Feb 17

Server memory is a component that’s either sufficient or insufficient and you likely don’t even think about RAM because other problems consume your attention.

That’s according to Crucial, a global brand of Micron Technology with operations in both Australia and New Zealand.

The organisation adds that when you have insufficient memory, your servers and your organisation’s productivity slow to a crawl because DRAM feeds your CPUs.

Michael Moreland, marketing manager for Crucial Server DRAM, explains that commissioning a recent survey enabled the company to see and hear what challenges IT pros are experiencing in their data centres.

“Based on what they’ve said, it’s clear that they believe server memory is critical for improving system performance,” he says.

“Adding more memory to servers improves CPU performance and efficiency, which ultimately helps alleviate the top five workload constraints they mentioned: limited budget, unexpected or unpredictable workload demands, limited floor space, rapid growth in user base, and power or cooling costs.”

The survey asked organisations to identify the top challenges they currently face in overcoming server workload constraints.

The 353 respondents, selected by Spiceworks, were required to have purchase influence in their organisation, to run at least 30 physical servers, and to be using virtualisation software.

Overall, 23 industries were represented (ranging from technology to energy to manufacturing) and 74% of respondents were running 100 or more physical servers, with 41% running over 200 boxes.

“When we surveyed the 350-plus IT managers from around the world, they listed a limited budget as their top workload constraint. Making the most of scarce resources is a hallmark of modern IT, and that’s why it’s critical to keep the total cost of ownership (TCO) down,” adds Moreland.

“That’s where adding server memory can help – maxing out a server’s memory provides fuel for the CPU to run optimally, allowing you to use fewer servers to accomplish more. With each server functioning more efficiently, it limits the power, cooling, and burdensome licensing costs that come with having more servers in your server room.”
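Moreland's argument is essentially arithmetic: fewer, better-fed servers cut the per-server power and licensing line items. A back-of-envelope sketch, using entirely hypothetical figures (power draw, electricity price, and licence cost are illustrative assumptions, not survey data):

```python
# Back-of-envelope TCO comparison: many half-full servers vs. fewer
# memory-maxed servers carrying the same workload. All figures are
# hypothetical assumptions for illustration only.

def annual_server_cost(n_servers, power_watts_each, kwh_price,
                       licence_per_server):
    """Rough yearly power + per-server licensing cost (ignores cooling, staff, etc.)."""
    hours_per_year = 24 * 365
    power_cost = n_servers * power_watts_each / 1000 * hours_per_year * kwh_price
    return power_cost + n_servers * licence_per_server

# Scenario A: ten servers, each running at half its memory capacity.
cost_a = annual_server_cost(10, power_watts_each=400, kwh_price=0.15,
                            licence_per_server=1200)

# Scenario B: five servers with maxed-out RAM; assume ~20% higher draw
# per box from the extra DIMMs and higher utilisation.
cost_b = annual_server_cost(5, power_watts_each=480, kwh_price=0.15,
                            licence_per_server=1200)

print(f"Half-full fleet: ${cost_a:,.0f}/yr, maxed fleet: ${cost_b:,.0f}/yr")
```

Even with the maxed-out boxes drawing more power individually, halving the server count roughly halves the licensing bill, which is where the claimed TCO saving comes from.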

Moreland explains that memory also helps maximise limited floor space, and that scaling up almost always involves increasing a server’s installed memory capacity to get as much out of the box and feed as many VMs as possible.

“Virtualised applications are heavily dependent on active data, and when there’s a spike in workload activity, available server memory resources are depleted and QoS drops. Filling your servers to their maximum RAM capacities reduces the strain on the systems when activity gets intense,” he says.

“Limited square footage in your server room places a premium on making the most of each individual server. When you don’t have the physical space to scale out, scaling up by maxing out the memory of each server can ultimately match the performance of multiple half-full ones.”
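The scale-up claim can be sanity-checked with simple VM-packing arithmetic. A minimal sketch, assuming illustrative sizes (384 GB vs. 768 GB hosts, 16 GB VMs, 16 GB hypervisor reservation; none of these come from the survey):

```python
# Rough scale-up arithmetic: how many VMs can one host feed from RAM?
# All sizes below are illustrative assumptions, not survey figures.

def vms_per_host(host_ram_gb, hypervisor_overhead_gb, vm_ram_gb):
    """VMs that fit in a host's memory after reserving hypervisor overhead."""
    return (host_ram_gb - hypervisor_overhead_gb) // vm_ram_gb

# A half-full box (384 GB installed) vs. the same box maxed out at 768 GB,
# each feeding 16 GB VMs with 16 GB reserved for the hypervisor.
half_full = vms_per_host(384, 16, 16)
maxed_out = vms_per_host(768, 16, 16)

print(f"Half-full host: {half_full} VMs, maxed-out host: {maxed_out} VMs")
```

Because the fixed hypervisor reservation is paid once per box, the maxed-out host here holds slightly more than twice the VMs of the half-full one, which is the sense in which one full server can match a pair of half-full ones.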

He explains that memory is like fuel for your CPUs – as long as they have enough of it, they’re OK.

“But there’s a significant difference between having enough RAM and truly improving workload efficiency. With just enough RAM, you’re certainly able to run applications, but with the maximum installed memory capacity, you’re often able to use fewer servers to get more done at a lower TCO,” says Moreland.

“In the case of servers, more isn’t always better, especially when they can be maxed-out with memory and meet or surpass the same level of performance of half-full servers,” he adds.

“Focus on quality, not quantity, when it comes to your server deployment and reduce your power, cooling, and license costs.”
