A red laser slices through the air, landing on top of an Amazon Echo sitting inside a house. Suddenly, the garage door opens, a burglar slides in, uses another laser command to make the Echo start the car and drives off.
Sound far-fetched? It’s not anymore.
Researchers from the University of Michigan have used laser lights to exploit a wide variety of voice-activated devices, giving them access to everything from thermostats to garage door openers to front door locks. The researchers have communicated their findings to Amazon, Google and Apple, which are studying the research.
Working with researchers from the University of Electro-Communications in Japan, U-M’s researchers published a paper and a website detailing how the attack works. There are also videos showing it in action.
The researchers discovered that the microphones in these smart devices respond to light as if it were sound. Inside each microphone is a small plate called a diaphragm that moves when sound hits it. Using focused light, such as a laser or even a tightly focused flashlight, they were able to make the diaphragm move as if sound were hitting it, giving them control of the device.
This creates security issues because, while most of these devices sit inside locked houses, light can travel through windows. Light also travels long distances easily, so an attacker is limited only by the ability to focus and aim the laser beam.
Researchers worked from the end of a 110-meter-long hallway and got a voice-activated system to respond. All the equipment needed to mount the attack was available on Amazon.
The attack can be mounted using a simple laser pointer, a laser driver and a sound amplifier, researchers said on the website. A telephoto lens can be used to focus the laser for long range attacks.
So how does it work?
“Microphones convert sound into electrical signals,” the research says. “The main discovery behind light commands is that in addition to sound, microphones also react to light aimed directly at them. Thus, by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio.”
In other words, the microphone reacts to the intensity of the laser light the same way it reacts to changes in pressure from sound waves.
So a hacker can record their voice issuing a command, use a laser modulator to encode that recording in the intensity of a laser beam and aim it at a device, which then responds as if someone were talking to it.
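The pipeline the researchers describe, recording a command and then varying a laser's brightness to match the audio waveform, amounts to simple amplitude modulation. Here is a minimal Python sketch of that modulation step; it assumes a hypothetical laser driver that accepts a normalized intensity waveform, and all names and values are illustrative rather than taken from the paper:

```python
import numpy as np

SAMPLE_RATE = 44_100  # samples per second, standard audio rate

def audio_to_laser_intensity(audio, depth=0.5):
    """Map an audio waveform (values in [-1, 1]) onto a laser intensity
    signal in [0, 1]: a constant bias with the audio riding on top of it.
    This is the amplitude modulation that lets a microphone's diaphragm
    react to light as if it were sound."""
    audio = np.clip(audio, -1.0, 1.0)
    # Bias at half power so the modulated intensity never goes negative.
    return 0.5 + 0.5 * depth * audio

# Example: a 440 Hz tone standing in for a recorded voice command.
t = np.arange(0, 0.01, 1.0 / SAMPLE_RATE)
tone = np.sin(2 * np.pi * 440 * t)
intensity = audio_to_laser_intensity(tone)
```

The `intensity` array would then drive the laser's brightness over time; because the microphone's diaphragm responds to the changing light the way it responds to changing air pressure, the original audio is effectively reconstructed inside the device.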
So how do you stop it?
The most obvious defense is to make sure your voice-activated devices are not within sight of a window. The devices can also be placed behind something, such as a bookcase, TV or picture. That works because sound waves easily bend around objects while light waves don't, meaning the device would still respond to a voice, said Benjamin Cyr, one of the researchers at U-M.
What won’t work is simply placing tape over the microphone. The researchers tested several devices whose microphone openings were covered by dirt shields, and the laser attack still worked.
There’s a bigger lesson here as well, said Daniel Genkin, another of the researchers.
“We need to do security by design,” he said, noting that hackers will exploit any vulnerability they can find.
Contact David Jesse: 313-222-8851 or firstname.lastname@example.org. Follow him on Twitter: @reporterdavidj
Read or Share this story: https://www.usatoday.com/story/tech/2019/11/08/alexa-amazon-echo-hackers-university-michigan/2529835001/