How 'data gravity' centres can spell trouble for enterprises
Data is being created everywhere, constantly.
In the not-too-distant past, data was created in a much more centralised place, and users and systems had far less access to it. Now, with digital data from social, analytics, mobile, cloud, IoT and more being created with both simultaneity and omnipresence, so much information is being collected that it’s forming a ‘centre of gravity’.
As more data is created, this gravity strengthens, and more applications and services come to rely on the data. The result is a circular, compounding effect: the pool of data keeps expanding, and as its gravity grows, the ability to move it diminishes.
This, in turn, inhibits enterprise workflow performance, raises security concerns, and increases costs, all complicated by regulatory requirements and other artificial constraints.
So how can data gravity be managed? The first step is measuring it, and Digital Realty has recently created a means to do just that.
The Data Gravity Index (DGx) is a global forecast that measures the intensity and gravitational force of enterprise data growth for 21 metropolitan areas across the world. It measures data intensity in gigabytes used per second, and according to the index, data gravity is due to intensify considerably in the coming years.
Data gravity intensity is forecast to grow at a compound annual growth rate (CAGR) of 139% globally through 2024, as data stewardship drives global enterprises to expand their digital infrastructure capacity to aggregate, store and manage the majority of the world's data.
By the same year, Forbes Global 2000 enterprises will be creating data at a rate of 1.1 million gigabytes per second and will require 15,635 exabytes of additional data storage annually. These enterprises face the greatest risk from a rapidly increasing data gravity score: they spend US$2.6 trillion annually on IT infrastructure and networking while operating the most complex systems and serving millions of users and endpoints.
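To put these growth figures in perspective, the short sketch below applies a simple compound-growth model to the CAGR quoted above and converts the per-second creation rate into an annual volume. It is an illustration only, not the index's methodology; the starting intensity of 1 GB/s is a hypothetical baseline chosen for readability.

```python
# Illustrative only: simple compound growth, not the Data Gravity Index methodology.

def project_cagr(initial, cagr, years):
    """Project a value forward at a given compound annual growth rate."""
    return initial * (1 + cagr) ** years

# A hypothetical baseline intensity of 1 GB/s, growing at the forecast 139% CAGR,
# compounds to roughly 33x its starting value over four years.
intensity = project_cagr(1.0, 1.39, 4)
print(f"After 4 years at 139% CAGR: {intensity:.1f}x the starting intensity")

# The article's creation rate of 1.1 million GB/s, expressed as an annual
# volume in exabytes (1 EB = 1e9 GB).
gb_per_year = 1.1e6 * 60 * 60 * 24 * 365
print(f"Annual data creation: {gb_per_year / 1e9:,.0f} exabytes")
```

The gap between annual creation and the 15,635 exabytes of additional storage cited above reflects that not all data created is retained.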
“We’ve seen that Data Gravity not only attracts data but makes both data and services that rely on it exponentially more difficult to move," says Dave McCrory, who led the research on the Data Gravity Index.
"This gives cities with a particular weight in one industry, like Singapore’s robust financial services space or Japan’s established manufacturing sector, a huge advantage as they naturally attract more of the same kind of data and services – and with them businesses.
"This also makes it more challenging to attract opportunities away from them. For businesses, it’s less advantageous. Data has become a key strategic resource, but data gravity means too much of it can be difficult to use and impossible to move while constantly creating and attracting more.”
Singapore leads the metro areas with a predicted 200% CAGR through 2024, followed by Hong Kong (177%), Sydney (159%) and Tokyo (155%).
These statistics present enormous obstacles for enterprises, whose current backhaul architecture is not equipped to deal with their data gravity needs, either now or in the future.
To deal effectively with data gravity, enterprises must prioritise a connected-community approach: integrating core, cloud and edge at centres of data exchange, and implementing a secure, hybrid, data-centric IT architecture at points of business presence worldwide.
Data gravity is a fundamental and urgent challenge posed to global organisations, with far-reaching consequences if it isn’t addressed.
Enterprises doing business in metropolitan areas with intense data gravity, or those seeking to expand to such centres, would do well to ask: what is the data gravity index score of this centre, and how will this affect future data management and growth?
More information about Digital Realty’s Data Gravity Index, and the methodology used to create it, is available from Digital Realty.