Australian organisations are under continual cyber threat as cybercriminals grow more sophisticated and attacks increasingly target government and industry. The best line of defence may now be automation. Artificial intelligence (AI)-powered automation can limit errors caused by human intervention, reduce human exposure to sensitive data and reduce the cost of a data breach. In fact, recent data shows that higher levels of automation correlate with lower breach-mitigation costs, with fully automated security reducing this cost by 59% compared with situations where automation is not deployed.
Automation has an increasing role in bolstering cyber security for organisations of all sizes, especially through behavioural analysis, pattern recognition, and faster response to potential security threats. In its recent budget announcement, the Australian government committed more than $100 million to help Australian businesses adopt AI and quantum technologies, underscoring both its ambitions for AI and the importance of AI-powered automation in enhancing cyber security.
Addressing the skills gap
Ransomware, phishing, and malware continue to plague Australian organisations daily. Human error plays a significant role in costly data breaches, and high threat volumes combined with a nationwide lack of cybersecurity experts exacerbate the issue and increase the likelihood of mistakes.
According to IBM research, organisations worldwide can close this skills gap and mitigate costs by incorporating automation into their operations. Organisations with a fully deployed security automation strategy had an average breach cost of AUD 4.38 million, whereas those with no automation strategy had an average breach cost of AUD 10.1 million, the study found.
Employing automation and AI to aid in threat detection and remediation has the potential to save organisations millions in damages in the event of a data breach while also mitigating the shortage of skilled labour in cybersecurity.
Digital assistants can bolster security
Even in organisations with a well-staffed, talented team of cybersecurity experts, employees can become easily overburdened and experience alert fatigue when alerts occur in high volumes.
When there is an overabundance of potential threats flagged, triage and prioritisation can lag. According to a survey conducted by IDC and FireEye, more than one-third of IT security managers and security analysts admit to ignoring threat alerts once their queue is full.
Automating even a few of a cyber engineer’s daily tasks, such as defining security rules, scanning for threats, and tracking access control to critical data, can make a huge difference in their ability to focus on higher-value tasks.
When deployed together, AI and automation can separate the security breach alerts that require immediate attention from false alarms and ensure that the most critical alerts are moved to the top of the queue.
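The triage step described above can be sketched in a few lines of Python. This is a minimal illustration, not a production system: the `Alert` structure, the priority values, and the stand-in false-alarm classifier are all hypothetical, where a real deployment would use a trained model and a SIEM's alert schema.

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class Alert:
    priority: int  # lower value = more urgent (hypothetical scale)
    source: str = field(compare=False)
    description: str = field(compare=False)

def triage(alerts, is_false_alarm):
    """Drop alerts classified as false alarms, then order the rest so
    the most critical land at the top of the queue."""
    real = [a for a in alerts if not is_false_alarm(a)]
    return sorted(real)

# Toy stand-in for an ML classifier: treat low-priority alerts as noise.
alerts = [
    Alert(3, "endpoint-7", "unusual outbound traffic"),
    Alert(1, "vpn-gw", "20 failed logins in 60 seconds"),
    Alert(5, "workstation-2", "USB device inserted"),
]
queue = triage(alerts, is_false_alarm=lambda a: a.priority > 4)
```

After triage, the suspected false alarm is filtered out and the repeated failed logins sit at the head of the queue, ready for an analyst's attention.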
Digital assistants can also employ event-driven automation to trigger endpoint detection and response and then perform the appropriate remediation action. Manual threat hunting can be expensive and inefficient, but with automation, software bots can operate self-sufficiently and accurately when performing repetitive tasks, which saves time, money, and effort in the long run.
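The event-driven pattern can be sketched as a simple in-process event bus, where detection events trigger registered remediation handlers. The event names, host identifiers, and handler actions below are hypothetical; a real deployment would wire these handlers to an EDR platform's API rather than returning strings.

```python
from collections import defaultdict

class EventBus:
    """Minimal event bus: detection events trigger remediation handlers."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def on(self, event_type, handler):
        # Register a remediation action for a given detection event.
        self.handlers[event_type].append(handler)

    def emit(self, event_type, payload):
        # Fire every handler registered for this event, in order.
        return [handler(payload) for handler in self.handlers[event_type]]

bus = EventBus()
# Hypothetical remediation actions wired to a detection event.
bus.on("malware_detected", lambda e: f"quarantine host {e['host']}")
bus.on("malware_detected", lambda e: f"open ticket for {e['host']}")

actions = bus.emit("malware_detected", {"host": "ws-042"})
```

The value of the pattern is that remediation runs the moment the detection event fires, with no analyst in the loop for the routine cases.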
Striking the balance
Of course, many public and private sector organisations have already invested in a myriad of tools, solutions, and AI to defend against cyber intrusion. The real opportunity lies in automating between and across the technologies, people, and processes of this landscape to drive faster and more cohesive responses, rather than continuing to invest in new tools and capabilities.
Traditional tools and techniques that rely on indicators of compromise to identify threats, such as unusual network traffic or a spike in failed logins, are effective to a point. Combined with automation, AI, and machine learning technologies, however, they help cybersecurity analysts achieve greater detection and more easily perform follow-up tasks such as blocking IP addresses in firewall systems.
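One such indicator of compromise, repeated failed logins from a single source, can be turned into an automated blocking decision with very little code. The log format, field positions, and threshold below are illustrative assumptions; in practice the output would feed a firewall API rather than a Python set.

```python
from collections import Counter

def ips_to_block(log_lines, threshold=5):
    """Flag source IPs whose failed-login count meets a threshold,
    a classic indicator of compromise, for firewall blocking.

    Assumes each log line starts with the source IP (hypothetical format).
    """
    failures = Counter(
        line.split()[0] for line in log_lines if "FAILED_LOGIN" in line
    )
    return {ip for ip, count in failures.items() if count >= threshold}

# Six failures from one address, one from another (documentation IPs).
log = ["203.0.113.9 FAILED_LOGIN admin"] * 6 + ["198.51.100.4 FAILED_LOGIN root"]
blocked = ips_to_block(log)
```

Only the address that crossed the threshold is flagged, so an automation layer could safely push it into a firewall deny list while leaving the one-off failure alone.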
Automation and AI are not a replacement for human experts but rather a means to empower workers to focus on important tasks, supplement their abilities, and fill in the skills gaps that organisations tend to experience.
These technologies can identify behavioural patterns and risks that humans can't, but human oversight is crucial to ensuring their ethical use and judging which data is reliable and which is flawed.
The alliance between humans and bots is where the real potential for improvement lies, and we are likely to see both government agencies and private sector companies capitalise on that potential in the coming year.