Amazon's Alexa Virtual Personal Assistant Told Users "Kill Your Foster Parents"

Amazon's virtual personal assistant told a customer to "kill your foster parents," according to an investigation.

The unidentified customer, who heard Alexa issue the command last year, described the incident as a "whole new level of creepy" in a scathing review on Amazon's website, Reuters reported.


A separate, unnamed source at Amazon told Reuters the device was quoting from Reddit. The website, known as the front page of the internet, is famed for its forums on an expansive range of topics, some darker than others.

In another case, Alexa spoke to users about what Reuters described only as "sex acts," as well as dog mess.

In a statement to Newsweek, an Amazon spokesperson called the incidents "rare" and explained they were linked to the firm's Alexa Prize university competition, which aims to advance the assistant's artificially intelligent conversation skills. The competition's socialbots source some of their data from the internet, meaning "there is a possibility that a socialbot may accidentally ingest or learn from something inappropriate."

The socialbot in question was "taken down immediately," the spokesperson said.

The incidents are the latest hiccups Amazon has faced as it tries to make its personal assistant interact with users as seamlessly as possible. The retail giant is currently testing ways to make Alexa engage in human-like conversation via machine learning, with mixed results.

Success could make the devices ubiquitous in homes and help the company fend off rivals such as Google Home.

In March, some customers reported that Alexa had laughed eerily, unprompted.

One user wrote on Twitter at the time: "I didn't even know she laughed. Needless to say I unplugged my Echo and it is now sitting in the bottom of [a] box—away from me."

Another user wrote he was "lying in bed about to fall asleep when Alexa on my Amazon Echo Dot lets out a very loud and creepy laugh."

He joked: "There's a good chance I get murdered tonight."

In a separate incident, a U.S. couple said their Echo smart speaker recorded a private conversation and sent the audio file to one of the husband's employees.

The woman, identified only as Danielle, told the KIRO 7 news website: "I felt invaded.

"A total privacy invasion. Immediately I said, 'I'm never plugging that device in again, because I can't trust it.'"

The incident appeared to bypass the "wake word" safeguard of virtual personal assistants like Alexa and Google Home, which are designed to kick into action only after hearing their respective trigger names.

In a statement to Recode at the time, Amazon explained that the Echo speaker had picked up a word resembling "Alexa," then interpreted another part of the conversation as a command to send the recording to a contact.
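For illustration only, here is a minimal Python sketch of how wake-word gating can misfire. Every name in it (sounds_like_wake_word, handle_transcript) is hypothetical, and the loose string match stands in for the acoustic models real devices use; this does not reflect Amazon's actual implementation.

```python
# Illustrative sketch only -- all names are hypothetical and this is not
# Amazon's implementation. It shows how loose wake-word matching can let
# background speech be treated as a command.

WAKE_WORD = "alexa"

def sounds_like_wake_word(token: str) -> bool:
    # Real assistants use acoustic models with confidence thresholds; a
    # deliberately loose string match stands in here to show why words
    # that merely resemble the wake word can trigger the device.
    return token.lower().startswith(WAKE_WORD[:4])  # "alex", "alexander"...

def handle_transcript(tokens: list[str]) -> str | None:
    """Return the text treated as a command, or None if no trigger heard."""
    for i, token in enumerate(tokens):
        if sounds_like_wake_word(token):
            # Everything after the trigger is handed to command parsing,
            # e.g. a misheard "send message to <contact>" in the background.
            return " ".join(tokens[i + 1:]) or None
    return None  # no trigger: the device stays dormant

# A private conversation containing a near-match fires the assistant:
print(handle_transcript("did alex say the meeting moved".split()))
# -> "say the meeting moved"
```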

Amazon said: "As unlikely as this string of events is, we are evaluating options to make this case even less likely."

And last year, an Alexa user filmed herself asking the device whether it was "connected to the CIA." Alexa didn't respond.

At the time, Amazon released a statement attempting to reassure customers, describing the episode as a "technical glitch." Since then, Alexa answers that she works for Amazon.

The updated Amazon Echo Plus on display in Amazon's Day 1 building in Seattle on September 20, 2018. Amazon is attempting to make its virtual assistant speak seamlessly with customers. GRANT HINDSLEY/AFP/Getty Images

This article has been updated with comment from an Amazon spokesperson.
