
Siri, Alexa, and Google Home can be hacked with a laser


In brief: Researchers have found a novel way to hack smart devices using lasers. Improbable as that sounds, many devices rely on MEMS (micro-electro-mechanical systems) microphones that can be tricked into accepting commands beamed from hundreds of feet away, so long as there’s a clear line of sight.

For the last few years, tech giants like Amazon, Google, Apple, and others have been busy building smart devices that can be controlled by voice. Naturally, a lot of people have questioned the privacy and security implications, even more so after news broke that Amazon and Apple used thousands of workers to listen to digital assistant conversations. In Amazon’s case, Alexa transcripts can linger on its servers long after a user has requested their deletion.

Recently, researchers at the University of Michigan and the University of Electro-Communications in Tokyo discovered a new type of attack that once again proves smart devices carry significant privacy risks. They found that pointing a powerful laser at a mobile device or smart speaker and rapidly varying its intensity makes the device’s sensitive MEMS microphone register the light as if it were sound.

This photoacoustic effect essentially allowed the researchers to beam “light commands” to voice-controlled hardware from as far away as 361 feet (110 m). And since smart speakers are built for convenience, more often than not there are no additional security measures in place before they accept those commands. That means an attacker could use a digital assistant to make online purchases, unlock smart locks, open garage doors, or locate a connected car such as a Tesla.
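At its core, the signal-processing trick is plain amplitude modulation: the audio waveform of a spoken command is mapped onto the laser’s intensity, riding on a constant bias so the light power never goes negative. Below is a minimal Python sketch of that idea; the `audio_to_intensity` helper and its parameters are illustrative assumptions, not the researchers’ actual tooling, and driving real laser hardware involves details this ignores.

```python
# Illustrative sketch of the amplitude-modulation idea behind "light commands":
# an audio waveform is mapped onto laser intensity so the MEMS microphone
# responds to the light as if it were sound. The interface is hypothetical.
import numpy as np

def audio_to_intensity(audio: np.ndarray, dc_bias: float = 0.5,
                       depth: float = 0.4) -> np.ndarray:
    """Map audio in [-1, 1] to a normalized laser intensity in [0, 1].

    Light power cannot go negative, so the waveform rides on a DC bias;
    `depth` sets how far the intensity swings around that bias.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return np.clip(dc_bias + depth * audio, 0.0, 1.0)

# Example: encode a 1 kHz test tone sampled at 48 kHz.
sample_rate = 48_000
t = np.arange(sample_rate) / sample_rate
tone = np.sin(2 * np.pi * 1_000 * t)
intensity = audio_to_intensity(tone)
print(intensity.min(), intensity.max())  # ~0.1 to ~0.9, never negative
```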

The devices used in the experiments include Google Home, Amazon’s Echo lineup, Facebook Portal, iPhone XR, iPad 6th gen, Google Pixel 2 and Samsung’s Galaxy S9.

It’s worth noting there are some limitations, such as the need to aim precisely at a device’s microphone and to have a clear line of sight. However, the researchers note that while such an attack isn’t easy to pull off, the parts needed can cost as little as $600 and are readily available on Amazon. And judging by the researchers’ demo videos, the attack works surprisingly well even when the laser beam has to pass through a window.

The good news is that some existing protections can stop light commands from working, such as requiring the device to be unlocked for sensitive commands, or wake words that match a user’s voice signature. And even as companies do everything in their power to make voice assistants more useful, adoption rates aren’t that high.

The researchers think manufacturers can greatly improve security by using multiple microphones and shielding them from light; a rough sketch of why an array helps follows below.
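Genuine speech reaches every microphone in an array at roughly comparable levels, whereas a laser spot can only illuminate a single port. Here is a hedged Python sketch of that detection idea; the function name, threshold, and framing are assumptions for illustration, not any vendor’s actual countermeasure.

```python
# Sketch of the multi-microphone mitigation: flag a command whose energy is
# concentrated on one channel, which acoustic sound from a normal source
# would not produce. Threshold and test data are illustrative assumptions.
import numpy as np

def looks_like_light_injection(channels: np.ndarray,
                               ratio_threshold: float = 10.0) -> bool:
    """channels: array of shape (num_mics, num_samples).

    Returns True if one microphone carries far more energy than the rest.
    """
    energy = np.sum(channels.astype(np.float64) ** 2, axis=1)
    loudest = energy.max()
    runner_up = np.partition(energy, -2)[-2]  # second-loudest channel
    return loudest > ratio_threshold * max(runner_up, 1e-12)

# Example: four mics, a strong tone on channel 0 only (suspicious).
rng = np.random.default_rng(0)
signals = 0.01 * rng.standard_normal((4, 48_000))
signals[0] += np.sin(2 * np.pi * 440 * np.arange(48_000) / 48_000)
print(looks_like_light_injection(signals))  # True
```

A production detector would likely compare per-band energy and arrival-time differences rather than raw totals, but the single-channel energy test captures why an array helps. In any case, it will be interesting to see how tech giants respond to the findings.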
