
Leveling the playing field: Why clean data makes for clean elections

30 Jul 18

As we approach August 2018, anyone opening a newspaper or browsing a social media feed could be forgiven for thinking we are stuck in November 2016. The result and mechanics of the American election remain very much relevant thanks to revelations about Cambridge Analytica, Facebook and the role misappropriated private data played in influencing the outcome.

Data is the lifeblood of political parties. It enables them to plan their campaigns, target campaign spend and make important policy decisions. Parties have traditionally had access only to limited pools of data, relying on surveys, polls and public reports to garner insights. This is because existing privacy legislation prevents political parties (or anyone else, for that matter) from gleaning private citizens' data without their consent.

This system engenders a level playing field during elections and prevents the public from being exploited for political gain. However, in the case of the 2016 U.S. election, private data was suddenly in play and the result is still being debated today in courts and the public domain. Essentially, because of a data breach we are stuck in a Groundhog Day with no end in sight.

Future elections under a cloud

So what does this mean for elections moving forward? Are we to assume they will all be compromised by private data falling into the wrong hands? For Australia, which will hold a number of state elections and quite possibly a federal election, the answer lies in the cloud.

While Cambridge Analytica has deleted its illegally obtained private data and vowed never to use it again, it is certainly not 'mission accomplished.' Unlike selfies, your private data is not confined to social media alone.

Citizen data lives in public-sector databases held by Centrelink, Medicare, local transport authorities, the Department of Human Services (DHS) and thousands of other bodies. As such, it is important to consider what effect a Cambridge Analytica-style overreach into our public sector's data might have on our nation, especially in the context of the Federal Government's Cloud First Policy, which underpins its Digital Transformation strategy.

As more government agencies move to the cloud, they are also moving data, not just between systems but between geographies. For example, while a local council's voter data was once stored on a server in its own building, a move to the cloud could see those files stored in a data centre located anywhere from Melbourne, Victoria to Melbourne, Florida.

A tale of two cities

Exploring these two hypothetical scenarios further reveals significant issues around data security and data sovereignty. When data is entrusted to an Australian cloud provider with data centres in Australia, it remains under Australian jurisdiction and cannot be accessed or tampered with by foreign or private parties.

When government data is entrusted to international cloud providers, the outlook is not so clear. Suppose Australian data is stored in Melbourne, Florida: under the recent U.S. CLOUD Act, the U.S. government can legally access that data on the basis that it is managed by a U.S. company. Furthermore, even if the U.S. company operates its data centre in Australia, the U.S. government can still assert a legal claim over the data if it chooses.

Protective measures in place

With these threats in mind, the Australian Signals Directorate (ASD) has in place strict requirements for the provision of cloud services to government bodies dealing with private data. Providers must be Australian companies exclusively operating Australian data centres, staffed by security-cleared IT personnel.

It may sound extreme, but when you consider that almost two years on we are still picking up the pieces of a U.S. election data scandal, perhaps 'extreme' is exactly what data security measures need to be to ensure clean, fair and transparent elections in Australia. Government departments therefore need to be thoughtful about how they move to the cloud and understand where their data will reside and under whose jurisdiction. We live in a world where trust in government is easily lost and hard to regain.

Article by Vault Systems founder and CEO, Rupert Taylor-Price.
