
Protect Your Data: Keyboard Sound Can Lead to Theft

Artificial intelligence is finding its way into ever more fields. A recent study has shown that a suitably trained AI model can work out what you type from the sound of your keystrokes alone, putting your data at risk.

If you often join conference calls, you may want to think twice before typing on your computer while your microphone is on.

This caution stems from a recent study in which British researchers trained an artificial intelligence model to recognize keystrokes during conference calls. By analyzing nothing but the sound of each key press, the model can reconstruct what you type with remarkable accuracy.


Your keyboard sounds might compromise your data


The research underscores that while people often shield their screens or keyboards when typing passwords, they rarely think about the sound their keyboards make. Yet this AI can decipher nearly everything typed on a computer, including sensitive emails and confidential documents. Remarkably, the model identified keystrokes recorded by a phone placed nearby with 95 percent accuracy, dropping only slightly to 93 percent when the keystrokes were captured over a Zoom call.

The AI's transcriptions are not always perfect; its typical errors are a missed keystroke or a single misidentified key. Even so, the output remains accurate enough to be useful to a would-be hacker.

It is worth noting that the AI must first be trained, and training alone isn't enough: the attacker also needs a clean recording of the target's keyboard. In the study, the training data was collected with a MacBook Pro and an iPhone 13 mini. The team pressed each key on the laptop 25 times and recorded the sounds with the phone placed about 15 inches away.
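
To make that pipeline concrete, here is a minimal, hypothetical sketch of how such a classifier could be trained: each labeled keystroke clip is converted into a spectrogram, and a classifier learns to map spectrograms to key labels. The directory layout, file-naming scheme, and the simple random-forest classifier are illustrative assumptions on our part; the researchers used a deep-learning model on spectrogram images, not the stand-in shown here.

```python
# Hypothetical sketch: train a keystroke-sound classifier from labeled clips.
# Assumes a folder of WAV files named like "a_001.wav" (key label, sample id);
# this layout and the simple classifier are illustrative, not the study's setup.
import glob
import os

import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def clip_features(path, n_fft=512):
    """Turn one keystroke clip into a fixed-size spectrogram feature vector."""
    rate, samples = wavfile.read(path)
    samples = samples.astype(np.float32)
    if samples.ndim > 1:                      # mix stereo down to mono
        samples = samples.mean(axis=1)
    _, _, spec = spectrogram(samples, fs=rate, nperseg=n_fft)
    spec = np.log1p(spec)                     # compress the dynamic range
    # Average over time so every clip yields the same feature length.
    return spec.mean(axis=1)

paths = sorted(glob.glob("keystrokes/*.wav"))
X = np.stack([clip_features(p) for p in paths])
y = [os.path.basename(p).split("_")[0] for p in paths]  # key label from filename

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2%}")
```

The key point the sketch captures is that each key produces a subtly different acoustic fingerprint, so a model trained on enough labeled presses can tell the keys apart.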

Despite these findings, there is no need for excessive alarm. Keyboards and laptops each have their own sound profile, so for any malicious use the AI model would have to be calibrated specifically to the target's device. For added security, you can change your typing habits or introduce background noise while entering sensitive information, as the sketch below illustrates.
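
To illustrate the masking idea, this hypothetical snippet mixes white noise into a recorded clip at a chosen signal-to-noise ratio; the file path and helper function are illustrative assumptions, not taken from the study. The louder the noise relative to the keystrokes, the less spectral detail remains for a model to exploit.

```python
# Hypothetical sketch: mask a keystroke recording with white noise at a target SNR.
import numpy as np
from scipy.io import wavfile

def add_white_noise(samples: np.ndarray, snr_db: float) -> np.ndarray:
    """Return samples mixed with white noise at roughly snr_db dB SNR."""
    signal_power = np.mean(samples.astype(np.float64) ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = np.random.normal(0.0, np.sqrt(noise_power), size=samples.shape)
    return (samples + noise).astype(np.float32)

rate, clip = wavfile.read("keystrokes/a_001.wav")       # path is illustrative
clip = clip.astype(np.float32) / np.max(np.abs(clip))   # normalize to [-1, 1]
masked = add_white_noise(clip, snr_db=0)                # 0 dB: noise as loud as signal
wavfile.write("masked.wav", rate, np.clip(masked, -1.0, 1.0))
```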

