Friday, September 13, 2024

Inferring Typing From Sounds and Eyes

Mark Stockley:

The technique, developed at Durham University, the University of Surrey, and Royal Holloway University of London, builds on previous work to produce a more accurate way to guess your password by listening to the sound of you typing it on your keyboard.

The slight differences in the sounds each key makes are an unintentional leak of information, known as a “side channel”.
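The idea can be illustrated with a minimal sketch. This is not the researchers’ method (which uses contemporary machine learning on real recordings); here we synthesize fake “keystroke” sounds in which each key has a slightly different dominant frequency, then show that even a simple nearest-centroid classifier on FFT magnitude spectra can tell the keys apart. The key set and frequencies are invented for the demo.

```python
# Illustrative sketch only, not the attack from the paper: each key's
# click is modeled as a noisy decaying tone with a slightly different
# dominant frequency, and a nearest-centroid classifier on magnitude
# spectra recovers which key was pressed.
import numpy as np

rng = np.random.default_rng(0)
SAMPLE_RATE = 16_000
DURATION = 0.05          # 50 ms per keystroke
KEYS = "abc"             # hypothetical three-key "keyboard"
BASE_FREQS = {"a": 900.0, "b": 950.0, "c": 1000.0}  # assumed slight differences

def synth_keystroke(key: str) -> np.ndarray:
    """Generate a noisy decaying tone standing in for a key's click."""
    t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE
    tone = np.sin(2 * np.pi * BASE_FREQS[key] * t) * np.exp(-t * 60)
    return tone + 0.1 * rng.standard_normal(t.size)

def spectrum(wave: np.ndarray) -> np.ndarray:
    """Feature vector: normalized magnitude spectrum."""
    mag = np.abs(np.fft.rfft(wave))
    return mag / np.linalg.norm(mag)

# "Training": average the spectra of a few recorded presses per key.
centroids = {k: np.mean([spectrum(synth_keystroke(k)) for _ in range(20)], axis=0)
             for k in KEYS}

def guess_key(wave: np.ndarray) -> str:
    feat = spectrum(wave)
    return min(centroids, key=lambda k: np.linalg.norm(centroids[k] - feat))

# Eavesdrop on six presses and reconstruct what was "typed".
recovered = "".join(guess_key(synth_keystroke(k)) for k in "abcabc")
print(recovered)
```

The toy version works because each key’s spectral peak is far larger than the noise floor; the real attack has to cope with much subtler per-key differences, which is where the machine learning comes in.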

Michael Nolan (paper):

While the technique presented in this paper relies on contemporary machine-learning techniques, such attacks date back at least to the 1950s, when British intelligence services surreptitiously recorded mechanical encryption devices employed by the Egyptian government.

Matt Burgess (Hacker News):

Today, a group of six computer scientists are revealing a new attack against Apple’s Vision Pro mixed reality headset where exposed eye-tracking data allowed them to decipher what people entered on the device’s virtual keyboard. The attack, dubbed GAZEploit and shared exclusively with WIRED, allowed the researchers to successfully reconstruct passwords, PINs, and messages people typed with their eyes.

[…]

To be clear, the researchers did not gain access to Apple’s headset to see what they were viewing. Instead, they worked out what people were typing by remotely analyzing the eye movements of a virtual avatar [EyeSight] created by the Vision Pro.
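The final decoding step can be sketched in a few lines. This is not GAZEploit itself (which first has to estimate gaze direction and keystroke timing from video of the avatar’s eyes); it only shows the last stage: given estimated 2‑D gaze fixation points and a known virtual keyboard layout, map each fixation to the nearest key. The layout coordinates and fixation data here are invented for the demo.

```python
# Illustrative sketch only: nearest-key decoding of gaze fixations
# against a (hypothetical) virtual keyboard layout.
import math

# Hypothetical layout: key -> (x, y) center in normalized screen space.
KEYBOARD = {
    "p": (0.9, 0.0), "i": (0.7, 0.0), "n": (0.6, 0.2),
    "1": (0.0, -0.2), "2": (0.1, -0.2), "3": (0.2, -0.2),
}

def nearest_key(fixation):
    """Return the key whose center is closest to the gaze fixation."""
    return min(KEYBOARD, key=lambda k: math.dist(KEYBOARD[k], fixation))

# Simulated noisy fixations while a user "types" a word with their eyes.
fixations = [(0.88, 0.03), (0.71, -0.02), (0.62, 0.21)]
typed = "".join(nearest_key(f) for f in fixations)
print(typed)
```

The hard part of the real attack is everything upstream of this lookup: recovering gaze coordinates and distinguishing a deliberate key selection from ordinary eye movement, using only the rendered avatar.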

Joe Rossignol:

The proof-of-concept attack was not exploited in the wild, according to the report. Nonetheless, Vision Pro users should immediately update the headset to visionOS 1.3 or later to ensure they are protected, now that the findings have been shared publicly.
