AI listening in - researchers uncover new security threat
Thu, 10th Aug 2023

A new university study has revealed that artificial intelligence (AI) can identify passwords by the sound of keys being pressed with more than 90% accuracy.

Typing a computer password while chatting over a video call on a service such as Zoom could open the door to a cyber attack, with the study revealing that AI can work out which keys are being pressed by eavesdropping on the sound of the typing.

Industry experts say that as video conferencing tools such as Zoom have grown in use, and devices with built-in microphones have become ubiquitous, the threat of sound-based cyber attacks has also risen, and this may give rise to a new kind of attack.

Dr Ehsan Toreini, co-author of the study at the University of Surrey, comments, "I can only see the accuracy of such models, and such attacks, increasing. [...] With smart devices bearing microphones becoming ever more common within households, such attacks highlight the need for public debates on governance of AI."

The research, published as part of the IEEE European Symposium on Security and Privacy Workshops, reveals how Toreini and colleagues used machine learning algorithms to create a system able to identify which keys were being pressed on a laptop based on sound alone, an approach that researchers have also deployed on the Enigma cipher device in recent years.

The study reports how the researchers pressed each of 36 keys on a MacBook Pro, including all of the letters and numbers, 25 times in a row, using different fingers and with varying pressure. The sounds were recorded both over a Zoom call and on a smartphone placed a short distance from the keyboard.

Joshua Harrison, first author of the study, from Durham University, says the team then fed part of the data into a machine learning system which, over time, learned to recognise features of the acoustic signals associated with each key. While it is not clear which clues the system used, Harrison says it was possible an important influence was how close the keys were to the edge of the keyboard.
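To illustrate the kind of pipeline the researchers describe, the sketch below shows how recorded keystroke clips could be reduced to spectral features and fed to an off-the-shelf classifier. It is not the study's own code: the feature extraction, the random forest model, and the synthetic placeholder clips are assumptions made purely so the example runs end to end.

```python
# Illustrative sketch of an acoustic keystroke classifier of the kind the
# researchers describe: record each key many times, extract spectral
# features from each press, and train a model to map sound -> key.
# NOT the study's code; features, classifier and data are placeholders.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

KEYS = list("abcdefghijklmnopqrstuvwxyz0123456789")  # the 36 keys in the study
PRESSES_PER_KEY = 25                                  # 25 presses per key, as reported
CLIP_LEN = 4_410                                      # ~0.1 s of audio at 44.1 kHz


def spectral_features(clip: np.ndarray) -> np.ndarray:
    """Reduce a raw keystroke clip to a coarse spectral fingerprint."""
    spectrum = np.abs(np.fft.rfft(clip))
    bands = np.array_split(spectrum, 32)              # 32 coarse frequency bands
    return np.array([band.mean() for band in bands])


def load_keystroke_clips() -> tuple[np.ndarray, np.ndarray]:
    """Placeholder for loading recorded keystroke audio.
    Random clips are synthesised here so the sketch runs end to end;
    a real attack would segment keystrokes out of a Zoom or phone recording."""
    X, y = [], []
    rng = np.random.default_rng(0)
    for label, _key in enumerate(KEYS):
        for _ in range(PRESSES_PER_KEY):
            clip = rng.normal(size=CLIP_LEN)           # stand-in for real audio
            X.append(spectral_features(clip))
            y.append(label)
    return np.array(X), np.array(y)


X, y = load_keystroke_clips()
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2%}")
# On real recordings the study reports roughly 93-95% accuracy; on this
# random placeholder data the score will sit near chance (~1/36).
```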

The results reveal that the system could accurately assign the correct key to a sound 95% of the time when the recording was made over a phone call, and 93% of the time when the recording was made over a Zoom call.

In a statement, Zoom says, "Zoom takes the privacy and security of our users seriously. In addition to the mitigation techniques suggested by the researchers, Zoom users can also configure our background noise suppression feature to a higher setting, mute their microphone by default when joining a meeting, and mute their microphone when typing during a meeting to help keep their information more secure."

Commenting on the findings, cyber expert Suid Adeyanju, CEO of RiverSafe, says, "This study should serve as a wake-up call about the true risks posed by artificial intelligence when the technology is hijacked by cyber criminals."

"Far too many organisations are rushing to adopt the technology without conducting even the most basic due diligence tests and in total disregard for standard security protocols."

Adeyanju says, "Over-enthusiastic executives should take note that AI may look like Barbie, but it could turn out to be Oppenheimer if the necessary cyber protections and regulatory procedures aren't in place."

The study, which is also authored by Dr Maryam Mehrnezhad from Royal Holloway, University of London, is not the first to show that keystrokes can be identified by sound. However, the team say their study uses the most up-to-date methods and has achieved the highest accuracy so far.

While the researchers say the work is a proof-of-principle study, and has not been used to crack passwords (which would involve correctly guessing strings of keystrokes) or been tested in real-world settings like coffee shops, they say the work highlights the need for vigilance.

Laptops, with their similar keyboards and common use in public places, are at particular risk, the researchers state; however, similar eavesdropping methods could be applied to any keyboard.

The researchers add there are a number of ways the risk of such acoustic side-channel attacks can be mitigated, including opting for biometric passwords where possible or activating two-step verification systems. Failing that, they say it's a good idea to use the shift key to create a mixture of upper and lower cases, or numbers and symbols.
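As a small illustration of that last piece of advice, the snippet below generates a password that mixes upper and lower case, digits and symbols; it is a generic sketch rather than anything from the study, and the length and character set are arbitrary choices.

```python
# Minimal sketch (not from the study): generate a password mixing
# upper/lower case, digits and symbols, in line with the researchers'
# advice that shift-modified characters make keystroke sounds harder
# to classify.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation


def mixed_password(length: int = 16) -> str:
    while True:
        candidate = "".join(secrets.choice(ALPHABET) for _ in range(length))
        # Keep only candidates that contain all four character classes.
        if (any(c.islower() for c in candidate)
                and any(c.isupper() for c in candidate)
                and any(c.isdigit() for c in candidate)
                and any(c in string.punctuation for c in candidate)):
            return candidate


print(mixed_password())
```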

Prof Feng Hao from the University of Warwick, who was not involved in the new study, says people should be careful not to type sensitive messages, including passwords, on a keyboard during a Zoom call.

Hao concludes, "Besides the sound, the visual images about the subtle movements of the shoulder and wrist can also reveal side-channel information about the keys being typed on the keyboard even though the keyboard is not visible from the camera."