
Storage squeeze: Why 2016 is the year of consolidation in the storage industry

28 Apr 16

Article by Tim Jones, senior technical specialist at Tintri

Today’s storage market is a tussle between old and new. Enterprise storage technology has changed considerably since the late nineties, but the advent of flash and VM-aware storage is shaking up the landscape further still.

So how will the industry cope?

Storage through the years

When I started out, enterprise applications predominantly ran on centralised compute platforms. Skip ahead several years and the market had changed significantly: mid-range and micro (PC-based) server systems were proliferating, and most were using RAID (redundant array of inexpensive disks). At this time, each server was essentially purpose-built, with storage sized to meet the needs of its application.

In the late nineties and early 2000s, Storage Area Network (SAN) systems were introduced. These systems eased the management difficulties of decentralised servers and changed how enterprise storage was used and consumed. Shortly after this, Network Attached Storage (NAS) came onto the scene, providing storage to the masses and offering another option for shared server storage.

Having these systems meant the overall environment could be more efficient while still providing good throughput to client systems. If you required high-speed data storage, the usual outcome was an entire rack filled with 73GB drives, because the more disk spindles you had, the greater the performance. For these systems, performance was expensive and capacity was cheap.

A new datacentre technology emerged around 2007: virtualisation. This meant abstraction not only at the storage level but also at the server level. As a result, the carefully planned, single-purpose server design of a 1998 SQL server no longer applied. We now needed to provide storage that was fast and large enough for multiple systems, and to diagnose problems when one of those systems misbehaved.

But the real turning point for storage was in 2010 when SSD and flash systems came on the market, and the traditional view of high-speed storage systems was flipped on its head – performance was cheap, capacity expensive.

Where’s the market heading?

What we now have is a slew of new players in the enterprise storage marketplace and a revolution in SAN and NAS system architectures to support SSD. With performance as table stakes, it’s management effort that’s the current differentiator between vendors.

According to Tintri’s annual State of Storage report, which surveyed hundreds of datacentre professionals globally, manageability is now acknowledged as the biggest storage pain point, leapfrogging performance as the greatest thorn in the datacentre’s side.

When asked what steps they were taking to address these challenges, 68 percent of datacentre professionals said they were evaluating new technologies and 48 percent were evaluating new storage vendors.

We are also witnessing growing momentum behind VM-aware storage (VAS) arrays. VAS is designed specifically to overcome the shortcomings of the highly abstracted environments that virtualisation creates, and more than 90 percent of enterprise server workloads are now virtualised. Indeed, Tintri’s State of Storage report found that 52 percent of organisations are looking into virtualisation-specific VM-aware storage.

These macro trends are putting the squeeze on legacy storage providers that lack the agility to respond. Dell and EMC are merging into an even larger (and presumably less flexible) entity. NetApp has announced layoffs in the wake of declining product revenues. And upstarts, too, have struggled to find their footing amid all the chop.

With all the change happening in the technology and in the market, now is the time to stay focused on the players who have the most compelling all-flash and VM-aware storage offerings. That’s the way to avoid the tussle and keep storage simple.

