Are we letting the data race cloud our judgement?
Mon, 4th Sep 2017

It's a question that has been doing the rounds seemingly forever – how do we keep costs down while keeping our data secure?

Cloud is just the latest technology to step into the front lines of the debate.

In general, there is a clear divide across the industry: some companies are racing to embrace new ways to ease their workload woes by investing in their own infrastructure, while others are investing only in mission-critical services and farming out the rest of their technology needs to external providers.

Managing your data in the cloud is not trivial, and ultimately there is no one-size-fits-all solution. But companies should at least be thinking this through before they decide on a provider and platform.

Rapid adoption of public cloud is driving the debate. The need for speed is pushing people to make quick decisions when patience and long-term strategy should be the virtues. Often, in their haste, they oversimplify the decision into public versus private cloud. I'd argue that both have benefits, but both also raise some serious questions.

If you already have a fully functioning, expensive data centre, then private cloud could be the one for you. You can use your established infrastructure, but the skills and time it takes to manage and maintain a data centre are a huge investment.

If you don't have this pre-existing infrastructure, the cost of building it can be astronomical. It's simply not an option for many. Enter public cloud. Your data is stored in a dedicated provider's data centre, and the provider is responsible for its management and maintenance. It's cheaper and allows you to get on with the day job.

It can also reduce lead times in testing and deploying new products.

Security, however, raises concerns for many – it is, in fact, the biggest decision driver when it comes to cloud adoption. But in reality, hacks into cloud providers are rare, and the big data breaches causing data loss involved legitimate, but stolen, user credentials. Increasingly, there is another aspect of public cloud that is giving people sleepless nights.

I wonder, are we fated to repeat the mistakes of the past?

In the 1980s and 1990s, companies outsourced their IT with the aim of becoming more efficient. But when the price of those contracts rose, they realised they lacked the internal skills and infrastructure to get themselves out.

Recently, we surveyed participants at F5's A/NZ Agility 2017 conference and found that half of the attendees were worried about being locked in by public cloud providers.

Platform lock-in is not a new concern, but it is an important one. It happens across the IT industry to varying degrees, but it is particularly exacerbated by the lack of standardisation across cloud providers: there are no standard interfaces, no common formats for the actual data, and no open platforms in which to edit or interchange that data.

This means that even if there weren't hidden costs and ambiguous small print – and often there are – actually migrating data across to a new cloud provider is hugely complex. Companies are finding themselves dependent on a single cloud provider, unable to move to a different vendor without considerable cost, disruption to services or legal constraints.
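One practical way to limit that dependency is to keep a provider-neutral boundary in your own code. The sketch below is purely illustrative – the ObjectStore and LocalObjectStore names are hypothetical, and no particular cloud vendor or SDK is assumed – but it shows the idea: business logic talks to a small interface, and each provider sits behind its own adapter.

```python
from abc import ABC, abstractmethod
from pathlib import Path


class ObjectStore(ABC):
    """Provider-neutral interface that application code depends on."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class LocalObjectStore(ObjectStore):
    """Runnable stand-in backed by the local filesystem.

    A public-cloud adapter wrapping a vendor's SDK would implement the
    same two methods, so business logic is untouched when workloads move.
    """

    def __init__(self, root: str) -> None:
        self._root = Path(root)
        self._root.mkdir(parents=True, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        path = self._root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self._root / key).read_bytes()


def archive_report(store: ObjectStore, name: str, body: bytes) -> None:
    # Calls the interface only; which vendor sits behind it is a config choice.
    store.put(f"reports/{name}", body)


if __name__ == "__main__":
    store = LocalObjectStore("/tmp/object-store-demo")
    archive_report(store, "q3.txt", b"quarterly numbers")
    print(store.get("reports/q3.txt"))
```

With that boundary in place, swapping providers becomes an adapter change rather than a rewrite – although, as noted above, moving the data itself and the surrounding services remains the hard part.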

This lack of integration has wider implications for the industry. If vendors are able to 'lock in' companies, they can also effectively 'lock out' any other players. Raising the barriers to entry into the market leads to less competition, and ultimately that hurts the end consumers.

There is some hope. Technology platform lock-in has a nasty habit of backfiring. Up until 2005, all digital music was essentially locked into the Apple iTunes ecosystem and available only on one of its devices.

But over the next few years, a series of monopoly cases against Apple led to all four major music labels removing Apple's licensing restrictions from their music by 2009.

All organisations need the flexibility to adapt their IT strategies to fit with business needs. Some might even decide to bring some components back in-house due to security-related risks. But it's not possible to get this kind of flexibility when you can't leave your current provider.

There is a third way – hybrid cloud. It is about picking and choosing the relevant services to suit your needs. The complexity of your requirements and the type of data involved should determine which providers you go with. Again, it comes back to really thinking about future-proofing your cloud strategy. Look ahead two years, five years, even ten years.

My advice: ringfence the workloads that you want to move into the public cloud from those that should be held on private servers. The architecture should be designed around your users' needs, as part of a well-considered strategy.
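As a rough illustration of that ringfencing exercise – the criteria and names below are my own hypothetical example, not a prescribed method – a first pass can be as simple as scoring each workload on data sensitivity and residency, and letting that drive placement:

```python
from dataclasses import dataclass


@dataclass
class Workload:
    """Hypothetical attributes; real criteria come from your own
    security, compliance and latency requirements."""
    name: str
    data_sensitivity: str   # "public", "internal" or "regulated"
    must_stay_onshore: bool


def placement(w: Workload) -> str:
    """Suggest where a workload sits in a hybrid design."""
    if w.data_sensitivity == "regulated" or w.must_stay_onshore:
        return "private cloud"
    if w.data_sensitivity == "internal":
        return "public cloud (encrypted, access reviewed)"
    return "public cloud"


if __name__ == "__main__":
    portfolio = [
        Workload("marketing-site", "public", False),
        Workload("customer-records", "regulated", True),
        Workload("build-pipeline", "internal", False),
    ]
    for w in portfolio:
        print(f"{w.name}: {placement(w)}")
```

In practice the rules would come out of your own security, compliance and latency requirements, but writing them down this explicitly makes the public/private split something you can review as the business changes.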

Ultimately, moving the workloads themselves can actually be easy, and lock-in at that level is pretty much irrelevant. It is the wrap-around services that are the real issue.

So read the small print and hedge your bets. There is no obligation for cloud providers to play nice with each other. Just take the best of what they have to offer and make sure your business needs are met.