Accenture: The importance of trust in data

03 May 18

Article by Peter Vakkas, Accenture’s technology lead for Australia and New Zealand

Businesses today are more data-driven than ever before.

It is estimated that businesses comb through approximately four to five billion data elements a day.

As a workforce, we no longer have the time or the ability to check everything personally, yet at the same time we can’t afford to make decisions based predominantly on experience and instinct.

An organisation’s success will depend on how it uses this huge volume of data to drive optimal decisions.

Accenture’s Technology Vision 2018 report found that 82% of executives believe their organisations are increasingly using data to drive critical and automated decision-making.

Yet, as artificial intelligence (AI) is used to make more business-critical decisions, inaccurate or manipulated information threatens to compromise the insights that companies rely on to plan, operate, and grow.

So, as businesses continue to use more and more data, how can they improve the accuracy and legitimacy of their data?

Grading the truth of data

When we think of data management, veracity or trust is not always top of mind, but this is all about to change.

While the power of data-driven insight is well known, recent events such as the Facebook data breach and the impending GDPR rules on consumer privacy signal a growing focus on the trust element of data.

Businesses are investing heavily in technologies that can help them maximise data-driven insights, but they also need to invest in the quality of the data going into those technologies.

Ensuring the integrity of data is one of the most important challenges in the digital economy. By failing to ensure data integrity, organisations leave themselves open to a new kind of vulnerability - a threat that’s critically overlooked.

A recent Accenture study found 97% of business decisions are made using data that the organisation’s own managers consider to be of unacceptable quality, resulting in business insights and decisions that are of questionable value at best or just “bad” at worst.

This doesn’t need to be the case: the risks of poor data veracity can be managed. Organisations can address this new vulnerability by building confidence in three key data-focused principles:

  • Provenance, or verifying the history of data from its origin throughout its life cycle.
  • Context, or considering the circumstances around its use.
  • Integrity, or securing and maintaining data.
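To make the three principles concrete, here is a minimal sketch in Python (the names `make_record` and `verify_record` are illustrative, not an Accenture tool): each data record is stamped with provenance metadata at the point of capture, and an integrity hash makes any later tampering detectable.

```python
import hashlib
import json
from datetime import datetime, timezone

def make_record(payload, source):
    """Wrap a payload with provenance metadata and an integrity hash."""
    body = {
        "payload": payload,
        "provenance": {
            "source": source,  # which system supplied the data
            "recorded_at": datetime.now(timezone.utc).isoformat(),
        },
    }
    # Integrity: hash the canonical JSON form of the record body.
    canonical = json.dumps(body, sort_keys=True).encode("utf-8")
    body["integrity_hash"] = hashlib.sha256(canonical).hexdigest()
    return body

def verify_record(record):
    """Recompute the hash and compare; any tampering breaks the match."""
    body = {k: v for k, v in record.items() if k != "integrity_hash"}
    canonical = json.dumps(body, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest() == record["integrity_hash"]

record = make_record({"temperature_c": 21.4}, source="sensor-17")
assert verify_record(record)

record["payload"]["temperature_c"] = 99.9  # simulated tampering
assert not verify_record(record)
```

The hash here only detects accidental or naive tampering; in practice a keyed signature (e.g. HMAC) would be needed so an attacker cannot simply recompute the hash.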

Taking responsibility for data veracity

How can organisations be sure whether their information accurately reflects their reality?

How can organisations guarantee that their data hasn’t been corrupted by malfunctioning assets or tampered with by outside forces?

Grading the veracity of intelligent data can be achieved by refocusing an organisation’s existing strategies: embedding and enforcing data integrity and security throughout the organisation, while adapting existing investments in cybersecurity and data science to address data veracity issues.

These existing capabilities within an organisation’s cybersecurity and analytics practices provide a foundation on which to build a new ‘data intelligence’ practice, which would work to uncover and address the factors contributing to the creation of false data in the first place.

Uncomfortable but true: if a business depends on data collection, it is potentially complicit in incentivising data manipulation.

The new practice will help to build data veracity, and uncover business practices encouraging manipulated data by setting standards for acceptable risk, based on business priorities and implications of automated decisions.

These initiatives will help businesses be confident in their insights, while remaining alert to new potential threats.

Organisations also need to build a strong digital identity foundation as part of their digital transformation. Beyond simply identifying people, businesses will also need to be able to identify devices and processes.

The ability to do so will enable companies to track which system supplied the data, which process created it and whether the data received is as expected.
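One way to picture this, purely as a sketch: a hypothetical registry of known devices and processes, recording which fields each source is expected to emit and within what range, against which every incoming message is checked.

```python
# Hypothetical registry of known sources and what each is expected to emit.
EXPECTED = {
    "sensor-17": {"fields": {"temperature_c"}, "range": (-40.0, 85.0)},
}

def check_message(source_id, payload):
    """Return a list of veracity problems for a message from source_id."""
    spec = EXPECTED.get(source_id)
    if spec is None:
        # Data from an unidentified device or process cannot be trusted.
        return ["unknown source: " + source_id]
    problems = []
    missing = spec["fields"] - payload.keys()
    if missing:
        problems.append("missing fields: " + ", ".join(sorted(missing)))
    lo, hi = spec["range"]
    for field in spec["fields"] & payload.keys():
        if not lo <= payload[field] <= hi:
            problems.append(f"{field} out of expected range")
    return problems

assert check_message("sensor-17", {"temperature_c": 21.4}) == []
assert check_message("sensor-99", {}) == ["unknown source: sensor-99"]
```

A real deployment would back this registry with cryptographic device identities rather than bare string IDs, but the principle is the same: know the source, and know what the source should look like.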

By merging the separate areas of security operations and anomaly detection, organisations can ensure the integrity, provenance and context of their data, creating a solid foundation for data veracity.

To effectively grade and verify data, organisations need to acquire an understanding of the “behaviour” around data.

This behaviour can be traced back to the data’s origins, whether a data trail created by online shopping or a sensor network reporting temperature readings for an industrial system.

By building the capability to track and record this data behaviour, organisations can provide cybersecurity and risk management systems with a baseline of expected behaviour around data.

These baselines will equip companies with the insight to detect data tampering that precedes poor decisions.
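A minimal sketch of such a baseline, assuming simple numeric sensor readings and a standard-deviation threshold (the function names and threshold are illustrative):

```python
import statistics

def build_baseline(history):
    """Summarise expected behaviour from a history of trusted readings."""
    return {"mean": statistics.mean(history), "stdev": statistics.stdev(history)}

def is_anomalous(baseline, value, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the mean."""
    if baseline["stdev"] == 0:
        return value != baseline["mean"]
    z = abs(value - baseline["mean"]) / baseline["stdev"]
    return z > threshold

# Expected behaviour learned from a period of trusted readings.
history = [21.1, 21.4, 20.9, 21.3, 21.0, 21.2]
baseline = build_baseline(history)

assert not is_anomalous(baseline, 21.5)  # within normal variation
assert is_anomalous(baseline, 35.0)      # possible tampering or fault
```

An anomalous reading does not prove tampering; it is a trigger for investigation, which is exactly the role these baselines play inside a cybersecurity and risk management workflow.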

Data veracity isn’t just about minimising threats. By making these investments, companies will generate more value from their data, establish trust and build a strong foundation for the success of other digital transformation initiatives.

Confidence for the future

Data has become the key for success of digital companies, fuelling complex business decisions that generate continued growth.

It’s therefore crucial that businesses ensure the integrity of their data, which drives AI systems to make unbiased decisions, keeps Internet of Things applications reliable and makes robotics systems productive.

Trustworthy data is essential to the operation of these smart technology systems and to the overall success of the intelligent enterprise.

In a world where trust in organisations is diminishing, businesses have a responsibility to evaluate their approach to information governance.

To tackle the issue of data veracity, businesses must assess their data management process and determine whether it’s just another cost item for the IT department or an integral part of how value is created.

How organisations respond to this question will determine their ability to compete, be relevant and build trust in tomorrow’s business environment.

Regardless of where a company is on its digital transformation journey, the time for data veracity is now.
