Hackers can take control of intelligent assistants using high-frequency sounds.
(CCM) — Intelligent assistants such as Amazon's Alexa and Apple's Siri have been hacked using high-frequency sounds that are inaudible to humans, according to a New Scientist report.
Using the technique, hackers can make phone calls, post on social media, and disconnect wireless service, according to the report. More worryingly, it also lets them make a device open a malicious website and download malware, or start a voice or video call to eavesdrop on the device's surroundings.
The hack was discovered by a team from Zhejiang University in China, and it works on Amazon's Alexa, Apple's Siri, Google Now, Samsung S Voice, Microsoft Cortana, and Huawei HiVoice. It also works on some voice recognition and control systems used in cars.
"If all a voice assistant could do was set an alarm, play some music, or tell jokes, then there wouldn't be much of a security issue," Tavish Vaidya, a security researcher at Georgetown University, told the New Scientist. "But voice assistants are connected to an increasing number of services, ranging from smart thermostats to internet banking, so any security breaches are pretty serious."
Vaidya added that the hack could be prevented if these intelligent assistants were programmed to ignore any sounds that fall outside the human audible frequency range.
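The mitigation Vaidya describes amounts to low-pass filtering: discarding signal energy above roughly 20 kHz (the upper limit of human hearing) before audio reaches the speech recognizer. As a rough illustration only, not the vendors' actual countermeasure, here is a minimal FFT-based sketch; the function name `remove_ultrasonic` and the test tones are illustrative assumptions.

```python
import numpy as np

def remove_ultrasonic(samples, sample_rate, cutoff_hz=20_000):
    """Zero out frequency components above the human audible range.

    A minimal sketch: a real system would use a proper analog or
    digital low-pass filter, but the idea is the same -- discard
    energy above ~20 kHz before the signal reaches the recognizer.
    """
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    spectrum[freqs > cutoff_hz] = 0.0  # drop the ultrasonic bins
    return np.fft.irfft(spectrum, n=len(samples))

if __name__ == "__main__":
    fs = 96_000  # sample rate high enough to represent ultrasonic content
    t = np.arange(fs) / fs
    audible = np.sin(2 * np.pi * 1_000 * t)      # 1 kHz tone (audible)
    ultrasonic = np.sin(2 * np.pi * 30_000 * t)  # 30 kHz tone (inaudible)
    filtered = remove_ultrasonic(audible + ultrasonic, fs)
    # The audible tone survives; the ultrasonic component is removed.
    print(np.abs(filtered - audible).max() < 1e-6)  # → True
```

One caveat worth noting: the attack reportedly exploits microphone hardware that demodulates ultrasonic input into the audible band, so filtering would ideally happen as early in the audio pipeline as possible.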
Image: © Kheng Ho Toh – 123RF.com