Many enterprises are still struggling to control their critical data throughout backup as well as application development and testing processes, according to new IDC research commissioned by Actifio, the copy data virtualisation company.
In fact, the research shows that two-thirds of enterprises are failing to meet best practice standards for data control.
According to the research, the use of a copy data virtualisation platform limits the proliferation and availability of physical data copies, when data is both at rest and being moved or migrated between production and non-production environments, generations of hardware, data centres or cloud infrastructure.
Most importantly, copy data virtualisation decreases the number of targets available to those with harmful intent, Actifio says.
Keeping enterprise data both safe and accessible is a complex balancing act made all the more difficult by the geometric growth rate of production data.
When coupled with conventional data protection programmes and a siloed approach to copy data creation, the result is an unintended and uncontrolled proliferation of ‘rogue copies’ of sensitive data, according to the company.
This data is difficult to keep track of, let alone protect. Dissatisfaction among organisations and individual users with the responsiveness of infrastructure and operations often leads to ‘shadow IT’ operations that harbour such rogue copies.
Each added physical copy increases the ‘surface area of attack’ for this data, creating additional opportunities for the wrong people to get access to confidential or personally identifiable information of clients, says Actifio.
The problem still persists throughout the vast majority of enterprises, despite well publicised high-profile security breaches or data leaks, the company says.
In fact, the IDC research found the majority of enterprises fail to meet best practice standards for data control, and few apply data security policies consistently across their full spectrum.
IDC found that 77% of surveyed organisations fail to mask sensitive data during the test/development phase, which significantly increases the threat of a data breach.
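Masking in this context typically means replacing personally identifiable fields with irreversible tokens before production data is copied into test or development environments. A minimal sketch of the idea in Python, with illustrative record and field names that are assumptions for this example and not drawn from the survey or any Actifio product:

```python
import hashlib

# Hypothetical production records; the field names are illustrative
# assumptions, not taken from the IDC research.
production_rows = [
    {"customer_id": 1001, "name": "Alice Smith", "email": "alice@example.com", "balance": 2500},
    {"customer_id": 1002, "name": "Bob Jones", "email": "bob@example.com", "balance": 130},
]

SENSITIVE_FIELDS = {"name", "email"}


def mask_value(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token.

    Hashing keeps referential consistency across copies (the same input
    always yields the same token) without exposing the original data.
    """
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]


def mask_rows(rows):
    """Return a test/dev-safe copy of the rows with sensitive fields masked."""
    return [
        {
            key: mask_value(str(val)) if key in SENSITIVE_FIELDS else val
            for key, val in row.items()
        }
        for row in rows
    ]


test_rows = mask_rows(production_rows)
```

Because the tokens are deterministic, joins between masked tables still work, but a test/dev copy leaked from a shadow IT environment no longer exposes the underlying personal data.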
According to the analysts, a typical organisation holds 375 data copies, each of which can carry sensitive information and therefore adds to the risk of attack.
In terms of sectors, government performs best overall at implementing data control policies while the education sector is the weakest.
Within an organisation, the CIO is central to the implementation of data control/security policies - yet policies are applied only on an ad hoc basis 34% of the time.
“Our research clearly identified two major challenges faced by IT executives - the copy data proliferation problem and the copy data access problem,” says Phil Goodwin, IDC research director for Storage Systems and Software.
“Copy data is costly, and introduces risk when it needs to be accessed. Organisations need solutions that can automate copy data management and subsequently reduce risk and cost in the enterprise and public sector environments; manual efforts are simply insufficient,” he says.
The enterprise copy data access problem is simply too large and broad for IT organisations to handle manually, IDC says.
IDC estimates that copy data will cost IT organisations $50.63 billion by 2018; it currently consumes up to 60% of the IT storage hardware and infrastructure budget.
“The truth is most companies have no idea how many copies of a given data set are floating around in their infrastructure or in the cloud,” says Ash Ashutosh, Actifio CEO.
“If you don’t know how many copies you have, you don’t know where they are, and if you don’t know where they are, you can’t tell who has access to them,” he says.
The Actifio-commissioned white paper surveyed senior executives at 429 mid-to-large scale enterprises across five industry sectors - government, financial, education, healthcare and retail - and focused on current trends related to data access, management, masking, copy proliferation and tracking.