Chinese researchers have discovered a terrifying vulnerability in voice assistants from Apple, Google, Amazon, Microsoft, Samsung, and Huawei. It affects every iPhone and MacBook running Siri, any Galaxy phone, any PC running Windows 10, and even Amazon’s Alexa assistant.
Using a technique called the DolphinAttack, a team from Zhejiang University translated typical vocal commands into ultrasonic frequencies that are too high for the human ear to hear, but perfectly decipherable by the microphones and software powering our always-on voice assistants. This relatively simple translation process lets them take control of gadgets with just a few words uttered in frequencies none of us can hear.
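The translation step the researchers describe is, at its core, amplitude modulation: the audible voice command is used to modulate an ultrasonic carrier above roughly 20 kHz, and nonlinearities in a phone's microphone hardware demodulate it back into the audible band that the speech-recognition software expects. A minimal NumPy sketch of that modulation step, with illustrative parameters (a 25 kHz carrier and a simple tone standing in for a recorded command, both assumptions rather than the paper's exact values):

```python
import numpy as np

fs = 192_000  # sample rate high enough to represent ultrasonic content
fc = 25_000   # hypothetical ultrasonic carrier, above human hearing (~20 kHz)

t = np.arange(0, 1.0, 1 / fs)

# Stand-in for a recorded voice command; a real attack would use
# actual speech samples here.
baseband = np.sin(2 * np.pi * 400 * t)

# Amplitude-modulate the command onto the ultrasonic carrier.
# Nonlinearity in the microphone's hardware acts as a demodulator,
# recovering the baseband command inside the device.
carrier = np.cos(2 * np.pi * fc * t)
modulated = (1 + 0.8 * baseband) * carrier
```

The resulting signal's energy sits entirely above the audible range, which is why a bystander hears nothing while the assistant hears a command.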
The researchers didn’t just activate basic commands like “Hey Siri” or “Okay Google,” though. They could also tell an iPhone to “call 1234567890” or tell an iPad to FaceTime the number. They could force a MacBook or a Nexus 7 to open a malicious website. They could order an Amazon Echo to “open the backdoor.” Even an Audi Q3 could have its navigation system redirected to a new location.
“Inaudible voice commands question the common design assumption that adversaries may at most try to manipulate a [voice assistant] vocally and can be detected by an alert user,” the research team writes in a paper just accepted to the ACM Conference on Computer and Communications Security.