Amazon Echo secretly recorded a family conversation and sent a copy without permission


Forget the eerie laugh. For a family in Portland, Oregon, an Amazon Echo device has become no laughing matter. Not only did one of the best-selling smart speakers in their home record their conversations, but it also sent those recordings to someone without the family’s permission. The alleged episode makes you wonder whether companies like Amazon are serious about privacy when it comes to smart home devices. 

According to KIRO-TV, the Oregon family loved Amazon devices and used them to control their home’s heat, lights, and security system. Then something strange happened.

According to Danielle, who did not want KIRO-TV to use her last name, someone called her home in early May and told her to unplug her Alexa devices because “you’re being hacked.” He explained that he had received audio files of recordings from inside Danielle’s house. The person on the other end of the phone wasn’t a stranger, but one of her husband’s employees, whose personal information was in Alexa’s contact list.

So, what happened?

In an email statement to The Washington Post, which picked up on the KIRO-TV story, Amazon explained that the Echo likely woke up when it heard a word that sounded like “Alexa.”

From there:

“The subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list.”

The company concluded: “As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
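To make that sequence concrete, here is a deliberately oversimplified, hypothetical Python sketch of how a chain of individually marginal matches could end with a message being sent. The contact list, thresholds, and fuzzy string matching below are invented for illustration only; none of it reflects Amazon’s actual pipeline.

```python
# Hypothetical sketch of the chain Amazon describes: a wake-word false positive,
# background speech heard as a "send message" request, and more background
# speech matched to a name in the contact list. All names, thresholds, and
# matching logic are invented for illustration.
from difflib import SequenceMatcher

CONTACTS = ["mom", "dan", "office"]  # hypothetical contact list

def sounds_like(heard: str, target: str, threshold: float = 0.5) -> bool:
    """Crude stand-in for acoustic matching: fuzzy string similarity."""
    return SequenceMatcher(None, heard.lower(), target.lower()).ratio() >= threshold

def handle_audio(heard: list[str]) -> None:
    """Walk overheard speech through wake word -> intent -> contact."""
    stream = iter(heard)

    # Step 1: a word in the background conversation sounds like the wake word.
    if not sounds_like(next(stream, ""), "alexa"):
        return

    # Step 2: the following conversation is heard as a "send message" request.
    if "send" not in next(stream, ""):
        return

    # Step 3: the device asks "To whom?" and matches background speech to a contact.
    matches = [c for c in CONTACTS if sounds_like(next(stream, ""), c)]
    if matches:
        print(f"Recording sent to: {matches[0]}")  # the unintended outcome

# Three snippets of ordinary conversation, each only a marginal match at one
# stage, chain together into a message being sent.
handle_audio(["alexis", "we should send it later", "down"])
```

Each stage on its own is a plausible misfire; the point of the sketch is that loose matching at every step lets ordinary conversation slip all the way through.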

This isn’t the first time Amazon’s voice assistant has been accused of listening to people without their permission. Amazon’s smart devices are supposed to record audio only after a user issues a voice command, known as the “wake word.” In April, however, researchers discovered a flaw in the Alexa voice assistant that enabled the device to continue listening and recording. After hearing from the researchers, Amazon fixed the vulnerability.

The best way to guarantee Alexa, Siri, Google Home, or any other voice assistant isn’t recording your conversation is to mute its microphone or unplug the device when it’s not in use. Otherwise, there’s always a chance you’re being recorded.

Nonetheless, the story sounds odd, to say the least. I’m not saying it didn’t happen, but Alexa would have had to jump through an improbable number of hoops to do what the family says it did. Even Amazon’s own explanation sounds like a stretch, don’t you think?

Do you believe this story? Let us know in the comments below.