After seven months of experiments, the team published a paper detailing the light-based flaw. By focusing lasers with a telephoto lens, they were able to issue commands to smart speakers from 230 to 350 feet away. In one demonstration, the Google Home they tricked into opening a garage door was inside a room in another building: the laser, aimed at its microphone port through the window, was modulated to be equivalent to the voice command "OK Google, open the garage door."
They explained that the microphones in these devices contain a small plate called a diaphragm, which vibrates in response to sound. A laser can reproduce this motion, which the device then converts into electrical signals it can understand. They said that tricking Google Home into opening a garage door was easy to pull off, and that with the same method they could make online purchases, open doors protected by smart locks, and even remotely unlock vehicles connected to voice-AI devices.
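The principle described above can be illustrated with a small simulation. This is only a hedged sketch of the physics, not the researchers' actual tooling: it assumes the laser's optical power is amplitude-modulated with an audio waveform, and models the MEMS diaphragm as a simple linear light-to-voltage transducer. All function names and parameter values here are illustrative assumptions.

```python
import math

SAMPLE_RATE = 8000  # samples per second (assumed for this toy model)

def voice_signal(freq_hz=440.0, duration_s=0.01):
    """A toy stand-in for a spoken command: a sine tone in [-1, 1]."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE) for t in range(n)]

def modulate_laser(audio, carrier_power=1.0, depth=0.5):
    """Amplitude-modulate the laser's optical power with the audio signal."""
    return [carrier_power * (1.0 + depth * s) for s in audio]

def diaphragm_response(light, sensitivity=0.8):
    """Model the diaphragm as a linear transducer: it outputs a voltage
    proportional to the received light intensity, with the constant (DC)
    component of the light removed."""
    dc = sum(light) / len(light)
    return [sensitivity * (p - dc) for p in light]

audio = voice_signal()
light = modulate_laser(audio)
recovered = diaphragm_response(light)
# 'recovered' tracks the shape of 'audio', which is why the device
# interprets the modulated laser as a voice command.
```

In this simplified model the recovered electrical signal is directly proportional to the original audio, so anything the attacker encodes into the laser's intensity arrives at the assistant looking like ordinary speech.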
Researchers have already notified Tesla, Ford, Amazon, Apple, and Google about the problem.
This is far from the first digital-assistant security flaw researchers have discovered. Researchers at Zhejiang University in China found that Siri, Alexa, and other voice assistants could be manipulated with commands sent at ultrasonic frequencies. Meanwhile, a group from the University of California, Berkeley discovered that they could hijack smart speakers by embedding commands inaudible to the human ear directly in music or spoken text.