
Understanding data is the first step in public sector cloud adoption

Wed, 23rd Sep 2020

While recent changes to procurement guidelines have cut some of the red tape keeping government departments from taking advantage of the cloud, many agencies are still figuring out where to start on their cloud journey.

Earlier this year, the Cloud Services Certification Program (CSCP) and the associated Certified Cloud Services List (CCSL) were disbanded and replaced with Cloud Security Guidance issued by the Australian Cyber Security Centre (ACSC) and the Digital Transformation Agency (DTA).

While there are enough acronyms there to make anyone's eyes glaze over, the change this represents is significant. Agencies now generally have more freedom to choose which cloud providers they partner with, as long as those providers pass a comprehensive risk and security assessment.

Previously, agencies could only engage with a handful of cloud service providers that had been pre-vetted by the Australian Signals Directorate. The recent changes not only make it easier for government bodies to adopt the cloud, but also open the door for more local Australian businesses to deliver their services to the public sector.

In essence, it gives government departments more power to 'choose their own adventure' – but you can't choose your own adventure if you don't know where to start.

And every cloud migration starts with data.

A place for everything

Before any cloud migration, it is essential to know precisely what data the organisation already has and where it's located.

Organisations of the size and scale of government departments hold immense data sets, so the first step should always be identifying high-value or confidential data to ensure the appropriate security measures can be applied.

Despite robust data handling processes, years of manual data classification and storage can create blind spots where human error has caused files to be catalogued and stored incorrectly. If these blind spots are not identified before data sets are migrated to the cloud, there is a risk that sensitive data ends up stored without the necessary security and access controls.

It seems this exact scenario unfolded earlier this month when 50,000 NSW driver's licences were mistakenly left exposed for anybody to access. While the business responsible for the data breach is yet to be identified, it is a perfect example of what can go wrong when data sets are stored in the cloud without the necessary due diligence.

Preliminary results from a NSW Government investigation into the breach found the licences were made public due to misconfigured privacy settings on the business' cloud service.
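
The investigation did not name the platform involved, so any illustration here is an assumption. As a minimal sketch, assuming AWS S3 and the boto3 SDK, an agency could audit for this class of misconfiguration along these lines:

```python
# A minimal audit sketch, assuming AWS S3 via boto3. The provider and the
# checks are illustrative; the NSW investigation did not name the platform.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def blocks_public_access(bucket: str) -> bool:
    """True only if every public-access block setting is enabled for the bucket."""
    try:
        config = s3.get_public_access_block(Bucket=bucket)
        return all(config["PublicAccessBlockConfiguration"].values())
    except ClientError:
        # No public-access block configured at all: treat as potentially exposed.
        return False

# Flag any bucket that could be publicly readable before sensitive data lands in it.
for bucket in s3.list_buckets()["Buckets"]:
    if not blocks_public_access(bucket["Name"]):
        print(f"WARNING: review {bucket['Name']} - public access is not fully blocked")
```

Run routinely, a check like this surfaces exactly the kind of configuration drift described above before the public finds it.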

It is important to note that just because data contains personally identifiable information, it isn't automatically precluded from being stored in the cloud. However, it does mean that the organisation holding it must first identify that the data is sensitive, and classify it as such, before migrating it to the cloud so appropriate privacy settings and access configurations can be enforced.

So, the first fundamental step in migrating to the cloud is understanding the data an agency owns, where it resides, how it is classified, and what security measures it requires.
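
What that understanding looks like in practice will differ between agencies, but each data set ultimately needs an inventory record along these lines. This is a sketch only: the field names and classification levels are illustrative, loosely echoing Australian government markings rather than quoting any standard.

```python
# A sketch of a per-data-set inventory record. Fields and classification
# levels are illustrative, not drawn from any official standard.
from dataclasses import dataclass
from enum import Enum

class Classification(Enum):
    OFFICIAL = "official"
    SENSITIVE = "sensitive"
    PROTECTED = "protected"

@dataclass
class DataAsset:
    name: str                     # what the data set is
    location: str                 # where it resides today
    owner: str                    # accountable business unit
    classification: Classification
    contains_pii: bool            # drives privacy and access configuration
    required_controls: list[str]  # the security measures it requires

asset = DataAsset(
    name="licence_scans_2019",                      # hypothetical data set
    location="fileshare://records/transport/2019",  # hypothetical path
    owner="Licensing",
    classification=Classification.PROTECTED,
    contains_pii=True,
    required_controls=["encryption-at-rest", "role-based-access", "no-public-read"],
)
```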

Everything in its place

Manually sifting through decades and terabytes of on-premises data to ensure everything is correctly stored and classified makes searching for a needle in a haystack seem like a pleasant pastime.

The labour hours involved in such an endeavour would quickly negate any potential cost-savings a government department might gain from migrating to the cloud.

On top of the cost, there is also the potential for human error to misclassify data, or to miss sensitive data entirely, during the process. With appropriate data governance, this process can be automated, with machine learning models doing the heavy lifting of discovering, classifying, and protecting sensitive data across the department's environment.
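
The shape of that discovery step can be sketched with something far simpler than a production system. The snippet below is a pattern-matching stand-in, not a trained model: the patterns, file types, and paths are hypothetical, and a real deployment would lean on ML-driven data loss prevention tooling instead.

```python
# A simplified stand-in for automated sensitive-data discovery. Real tools
# use trained ML models; this uses regular expressions for two common
# identifier shapes purely to illustrate the scan. Patterns and paths
# are hypothetical.
import re
from pathlib import Path

PATTERNS = {
    "tax_file_number_like": re.compile(r"\b\d{3}[ -]?\d{3}[ -]?\d{3}\b"),
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_file(path: Path) -> dict[str, int]:
    """Count candidate matches per category; any non-zero count flags the file."""
    text = path.read_text(errors="ignore")
    return {label: len(rx.findall(text)) for label, rx in PATTERNS.items()}

def scan_tree(root: Path) -> None:
    """Walk a legacy share and report files that look like they hold personal data."""
    for path in root.rglob("*.txt"):
        hits = {label: count for label, count in scan_file(path).items() if count}
        if hits:
            print(f"{path}: possible sensitive data {hits}")

scan_tree(Path("/data/legacy_share"))  # hypothetical on-premises share
```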

Not only do these algorithms help identify misplaced data in the first instance, but they can also continuously run in the background of an environment and help identify any future exposure risks so they can be immediately remediated.

Critically, once the migration is complete, the same machine learning models can help monitor data across both the new cloud environment and the remaining on-premises infrastructure, to ensure all data adheres to the relevant security and access protocols.
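
Stripped to its essentials, that monitoring step is a policy check: compare each data set's classification against the controls actually in force, wherever it lives. The sketch below runs over hypothetical scan results; in practice this would be wired into the cloud provider's native monitoring and alerting rather than a standalone script.

```python
# A minimal policy-check sketch over hypothetical scan results from both the
# cloud and on-premises environments. Locations and fields are illustrative.
from datetime import datetime, timezone

findings = [
    {"location": "s3://agency-records/licences/", "classification": "PROTECTED",
     "publicly_readable": True},
    {"location": "fileshare://records/archive/", "classification": "OFFICIAL",
     "publicly_readable": False},
]

def report_violations(findings: list[dict]) -> None:
    """Flag sensitive data whose controls have drifted out of policy."""
    stamp = datetime.now(timezone.utc).isoformat()
    for item in findings:
        if item["classification"] in ("SENSITIVE", "PROTECTED") and item["publicly_readable"]:
            print(f"[{stamp}] remediate now: {item['location']} is publicly readable")

report_violations(findings)
```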

Fear of the unknown

For many agencies, there is a prevailing idea that the majority of government data is people's personal information and there's a strong apprehension around losing control of that data once it 'leaves the building'.

This, above anything else, is the most significant impediment to cloud adoption in the public sector.

Despite this fear of the unknown, government bodies have much to gain from embracing the cloud. This includes increasing the pace of delivering new platforms, providing citizens with easier access to services, and reducing the effort and cost of maintenance while allowing agencies to focus on improving service delivery – rather than merely 'keeping the lights on'.

While some data is too sensitive to ever leave the purview of the IT team, a mature approach built upon the knowledge of what data an agency holds, where it is held, and how it should be protected, can enable any department to confidently find a place for everything and ensure everything is in its place.
