DolphinAttack – How hackers can take control of Siri and Alexa

Researchers at Zhejiang University have developed a way to activate and use speech recognition systems using frequencies humans can’t hear, TechCrunch reported.

The attack works by exploiting the non-linear response of microphone hardware, which introduces harmonics that demodulate inaudible ultrasonic signals into voice commands the speech recognition system can hear.
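As an illustrative sketch (not the researchers' actual tooling), the delivery side of such an attack amplitude-modulates an audible voice command onto an ultrasonic carrier; the microphone's non-linearity then recovers the baseband audio:

```python
import math

SAMPLE_RATE = 192_000   # Hz - high rate needed to represent an ultrasonic carrier
CARRIER_HZ = 25_000     # Hz - above the ~20 kHz upper limit of human hearing

def modulate(baseband, depth=1.0):
    """Amplitude-modulate a baseband signal (samples in [-1, 1]) onto the carrier.

    The result contains no audible-band energy on its own; a non-linear
    microphone effectively demodulates it back into the audible band.
    """
    out = []
    for n, s in enumerate(baseband):
        carrier = math.cos(2 * math.pi * CARRIER_HZ * n / SAMPLE_RATE)
        out.append((1 + depth * s) * carrier / (1 + depth))  # scaled to stay in [-1, 1]
    return out

# A stand-in "voice command": a 400 Hz test tone instead of real speech.
baseband = [math.sin(2 * math.pi * 400 * n / SAMPLE_RATE)
            for n in range(SAMPLE_RATE // 100)]  # 10 ms of audio
modulated = modulate(baseband)
```

The carrier frequency and modulation depth here are assumptions for illustration; in practice they would be tuned per target device.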

Dubbed “DolphinAttack”, the team tested the hack on Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana, and Alexa.

The researchers could trigger wake phrases such as “OK Google”, as well as dial numbers and issue multi-word commands like “unlock the back door”.

Success rates varied by phone, phrase, and distance, with the attack generally failing once the ultrasonic transducer was more than 1.5m from the device.

Users can mitigate the attack by disabling wake phrases, and many phones restrict access to sensitive functions until the device has been unlocked.
