Is Google Censoring Jesus? It Knows Buddha, Satan, Muhammad but Not Christian Savior

Google Vice President of Product Management Mario Queiroz shows the new Google Home. Getty Images

Google Home and Amazon's Alexa promise to provide answers to all of life's questions, like where you can order pizza or find pictures of cats. But now a man in Tennessee swears his electronic personal assistant doesn't know who Jesus is.

David Sams, a resident of a Nashville suburb, told local media that he asked his Google Home, "Who is Jesus Christ?" and got no direct answer.

"Google knew who I was, but Google did not know who Jesus was. Google did not know who Jesus Christ was, and Google did not know who God was," Sams told journalists.

OK Google - Who is Jesus Christ? https://t.co/bt152L6Rkn

— anetia woodsmall (@AnetiaWoodsmall) January 26, 2018

The revelation sparked a furor among far-right social media users, who were quick to accuse the devices of being at the center of an Islamic conspiracy.

One YouTube user posted a video demonstrating that his device could identify religious figures including Allah, Brahman, Krishna, Moses and Joseph Smith, but was unable to give a clear answer about who Jesus Christ is, responding only, "Here are some results from the web." Other social media users have posted videos of virtual assistants identifying religious figures including Buddha and Satan.

"Google Home is eager to talk about Mohammed, but doesn't know who Jesus is... The Islamification of the West is well underway," Mark Collett, a self-described British political activist, tweeted.

Google Home is eager to talk about Mohammed, but doesn't know who Jesus is... The Islamification of the West is well underway. pic.twitter.com/Ij4sDVrzhp

— Mark Collett (@MarkACollett) January 26, 2018

Alex Jones, the conspiracy theorist who runs the radio show Infowars, declared that Google is censoring Jesus.

Some social media users also turned on Amazon's Alexa, claiming that she had called Jesus a "fictional character."

Both Amazon's Alexa and Google Home are designed to answer a variety of questions, but many observers note that they still struggle to carry on a conversation and can sometimes misunderstand complex questions. These glitches have led to some mishaps.

For example, when a 6-year-old girl in Texas asked Alexa to play dollhouse with her, the device ordered her a $170 dollhouse. In a separate incident, a toddler asked Alexa to play a children's song, and the virtual assistant instead played a song about pornography.
