IMPULSE BUYING

Amazon’s Alexa heard her name and tried to order up a ton of dollhouses

Order whatever you want.
Image: Reuters/Konstantin Chernichkin

Alexa is turning out to be a pretty bad listener.

Streaming songs, ordering pizza, and booking cabs are no-brainers for Alexa, the voice-activated assistant installed on Amazon Echo devices. Unfortunately, Alexa also appears to enjoy a little unintentional retail therapy.

Recently, a six-year-old girl in Texas was able to order a $170 dollhouse and four pounds’ worth of sugar cookies through Amazon’s Echo Dot. But at least in that case, the kindergartner was actually talking directly to Alexa.

On the morning of Jan. 5, San Diego television station CW6 was reporting on the little girl’s purchases when it accidentally set off a slew of other Alexas’ attempted shopping sprees. During the on-air news segment, TV anchor Jim Patton said, “I love the little girl saying, ‘Alexa ordered me a dollhouse.’” Hearing the statement, Amazon Echoes in viewers’ homes mistook the remark for a command, and many viewers complained that their personal assistants likewise tried to place orders for dollhouses.

Amazon says it is “nearly impossible to voice shop by accident,” as in the Texas incident. “You must ask Alexa to order a product and then confirm the purchase with a ‘yes’ response to purchase via voice,” an Amazon spokesperson said in an email. The company says that while a TV newscast may have woken up a bunch of Alexas, the orders would not have gone through without a secondary affirmation from the user. It’s unclear whether the six-year-old in Texas confirmed her dollhouse purchase by saying “yes.”
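In other words, the flow Amazon describes is a two-step handshake: an order request opens a pending purchase, and only an explicit affirmation closes it. Here is a minimal sketch of that logic in Python; the function names and the crude string matching are invented for illustration, since Amazon has not published how Alexa actually parses intents.

```python
# Illustrative sketch only; all names (handle_utterance, ask_for_confirmation)
# are hypothetical. Amazon has not published Alexa's ordering logic.

def handle_utterance(utterance: str, ask_for_confirmation) -> str:
    """Two-step voice-purchase flow: an order intent alone never buys
    anything; the purchase completes only after an explicit 'yes'."""
    if "order" in utterance.lower():
        # Step 1: something that sounds like an order request was heard,
        # e.g. a TV anchor saying "Alexa ordered me a dollhouse."
        reply = ask_for_confirmation()  # Alexa asks: "Do you want to buy ...?"
        # Step 2: only an explicit affirmative places the order.
        if reply.strip().lower() == "yes":
            return "purchase placed"
        return "purchase cancelled"
    return "no order intent"

# A broadcast can trigger step 1 in many living rooms at once, but unless
# someone in the room answers "yes", the order never goes through.
print(handle_utterance("Alexa ordered me a dollhouse",
                       ask_for_confirmation=lambda: "silence"))
# -> purchase cancelled
```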

Voice-command purchasing is enabled by default on Alexa devices, which means anyone in San Diego that morning with the TV volume turned up and a wireless speaker switched on could have become the new owner of a KidKraft Sparkle Mansion. But only if they then confirmed the order their Alexa overheard on TV.

The dollhouse incident is more proof that Alexa is always listening. The device starts recording whenever it hears the wake word “Alexa,” capturing sound for up to 60 seconds each time. (For this reason, authorities recently tried to gain access to Alexa’s data in a murder investigation.) Helpful as that is, the feature arguably borders on an invasion of privacy and has fanned the broader security concerns surrounding the rise of internet-of-things (IoT) devices.
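Conceptually, the device behaves like an always-on microphone gated by a keyword spotter: nothing is kept until the wake word fires, and then audio is buffered for a bounded window. The sketch below mirrors that 60-second cap; it is an assumption-laden illustration with a stand-in microphone, not Echo firmware, which is not public.

```python
import time

# Hypothetical sketch: the real Echo audio APIs are not public, so FakeMic
# stands in for on-device keyword spotting and audio capture.

WAKE_WORD = "alexa"
MAX_RECORD_SECONDS = 60  # the recording window described above

class FakeMic:
    def transcribe_chunk(self) -> str:
        return "alexa, order me a dollhouse"  # canned input for the demo
    def read_audio(self) -> bytes:
        return b"\x00" * 320                   # placeholder audio frame

def listen_once(mic, record_seconds: float = MAX_RECORD_SECONDS) -> list:
    """Keep nothing until the wake word fires, then buffer audio for a
    bounded window before it is shipped off as an encrypted log."""
    buffer = []
    if WAKE_WORD in mic.transcribe_chunk().lower():
        start = time.monotonic()
        while time.monotonic() - start < record_seconds:
            buffer.append(mic.read_audio())
    return buffer  # on the real device, uploaded to Amazon's servers

frames = listen_once(FakeMic(), record_seconds=0.01)  # short window for demo
print(f"captured {len(frames)} audio frames after the wake word")
```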

The recordings are stored as encrypted logs on the company’s servers, the device’s microphone can be turned off, and recordings can be deleted manually from a user’s account. Even so, many users remain worried about just how much Alexa is actually hearing. “Down the road, the technology will be more sophisticated where it will be able to identify certain individuals and register [the] people [who] can access it,” Stephen Cobb, senior security researcher for ESET North America, told CW6.

While the six-year-old’s surprise order has found a home with pediatric patients at a Dallas hospital, users don’t have to hunt for a fix for accidental orders: Amazon offers free returns. To avoid such blunders altogether, users can tweak their speakers’ settings in the Alexa app, either requiring a four-digit code to confirm orders or turning off voice-controlled purchasing completely, as sketched below.
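For a sense of how those two safeguards change the confirmation step, here is a hedged sketch; the setting names are invented, since the real controls live behind toggles in the Alexa app rather than anything user-programmable.

```python
# Invented setting names for illustration; the real controls are app toggles.

VOICE_PURCHASING_ENABLED = True   # the default, per the article
CONFIRMATION_PIN = "4321"         # optional four-digit code set by the user

def confirm_order(spoken_reply: str) -> bool:
    """Decide whether a spoken reply completes a pending voice purchase."""
    if not VOICE_PURCHASING_ENABLED:
        return False  # voice ordering disabled: nothing can be confirmed
    if CONFIRMATION_PIN:
        # With a PIN set, a stray "yes" from the TV is not enough; the
        # speaker must say the four digits out loud.
        return spoken_reply.strip() == CONFIRMATION_PIN
    return spoken_reply.strip().lower() == "yes"

print(confirm_order("yes"))    # False: the PIN gate blocks a bare "yes"
print(confirm_order("4321"))   # True: only the code completes the order
```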

Correction: This story has been updated to reflect Amazon’s position on the San Diego broadcast: the company confirms that viewers’ Alexas may have woken up, but says they did not successfully order a bunch of dollhouses.