Chinese researchers have discovered a terrifying vulnerability in voice assistants from Apple, Google, Amazon, Microsoft, Samsung, and Huawei. It affects every iPhone and MacBook running Siri, any Galaxy phone, any PC running Windows 10, and even Amazon’s Alexa assistant.
Using a technique called the DolphinAttack, a team from Zhejiang University translated typical vocal commands into ultrasonic frequencies that are too high for the human ear to hear, but perfectly decipherable by the microphones and software powering our always-on voice assistants. This relatively simple translation process lets them take control of gadgets with just a few words uttered in frequencies none of us can hear.
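At its core, the trick relies on ordinary amplitude modulation: the spoken command becomes the envelope of a carrier tone above 20 kHz, and nonlinearities in a device’s microphone hardware demodulate it back into the audible band, where the speech-recognition software reads it as normal speech. The Python sketch below illustrates only the modulation step under stated assumptions: the input file name, the 30 kHz carrier, the 192 kHz output rate, and the modulation depth are illustrative choices, not parameters taken from the paper.

```python
# Minimal sketch of amplitude-modulating a spoken command onto an ultrasonic carrier.
# File names and numeric parameters below are illustrative assumptions only.
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample

IN_FILE = "command.wav"      # hypothetical recording of a spoken command
OUT_FILE = "ultrasonic.wav"  # modulated, near-inaudible output
CARRIER_HZ = 30_000          # carrier above the ~20 kHz limit of human hearing
OUT_RATE = 192_000           # output sample rate high enough to represent the carrier

rate, voice = wavfile.read(IN_FILE)
voice = voice.astype(np.float64)
if voice.ndim > 1:                       # mix to mono if the recording is stereo
    voice = voice.mean(axis=1)
voice /= max(np.max(np.abs(voice)), 1e-9)  # normalise to roughly [-1, 1]

# Resample the baseband command up to the high output rate.
n_out = int(len(voice) * OUT_RATE / rate)
baseband = resample(voice, n_out)

# Classic AM: the carrier's amplitude follows the voice signal.
t = np.arange(n_out) / OUT_RATE
carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
modulated = (1.0 + 0.8 * baseband) * carrier   # 0.8 = modulation depth

wavfile.write(OUT_FILE, OUT_RATE, (modulated * 32767).astype(np.int16))
```

Playing such a signal back also requires a speaker and amplifier that can reproduce frequencies above 20 kHz, which ordinary laptop or phone speakers generally cannot do well.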
The researchers didn’t just activate basic commands like “Hey Siri” or “Okay Google,” though. They could also tell an iPhone to “call 1234567890” or tell an iPad to FaceTime the number. They could force a MacBook or a Nexus 7 to open a malicious website. They could order an Amazon Echo to “open the backdoor.” Even an Audi Q3 could have its navigation system redirected to a new location.
“Inaudible voice commands question the common design assumption that adversaries may at most try to manipulate a [voice assistant] vocally and can be detected by an alert user,” the research team writes in a paper just accepted to the ACM Conference on Computer and Communications Security.
DolphinAttack: inaudible voice commands
Source: FastcoDesign
Banking 4.0 – “how was the experience for you”
“So many people are coming here to Bucharest, people that I see and interact with on LinkedIn, and now I get the chance to meet them in person. It was like being at the Football World Cup, but this was the World Cup of payments and open banking on LinkedIn.”
There are many more interesting quotes in the video below: