App That Lets Kids Talk To Santa Claus Propositions 8-Year-Old Girl

A free app that purports to let children text and call Santa Claus is under fire for messages that propositioned a little girl.

Last Thursday, Ashley Adams and Justin Bell's 8-year-old daughter asked to download an iPhone app, "Santa Call & Text You." The app shows a cheerful Santa winking at the camera, and allows children to text, call or video chat with St. Nick.

"Only me, and her dad and a few other people know the password for her to download any type of apps or anything," Adams told the Gaston Gazette. "She wanted to download this Santa app because she wanted to talk to Santa. [Bell] was sitting right beside her and she was so excited to show the responses."

But after the little girl sent "Santa" a text to say "Hi," the app replied with "What are you wearing?"

The app claimed to let children chat or call Santa, as well as download Christmas-themed emoji and stickers. Getty

At that point, Bell took the phone and asked why the app would ask such a question. He also called the police, who came over and helped Bell continue the conversation with the app.

Though many of "Santa's" responses were generic or garbled, the app later asked, "How old are you?" Bell replied "8," and the app said, "You are too old for me."

The experience with the app dampened the little girl's enthusiasm for Christmas, and according to her mother, she is now afraid of Santa Claus.

"I feel like my daughter was violated because she didn't understand why Santa wanted to know what she was wearing," Adams told WGHP.

Reviews of the app say that the interactions with "Santa" are wholly automated by a bot, though Adams believes a human could be at the other end. Neither Apple nor the app's developer, Nguyen Thao, has commented on the incident.

Though it's unclear whether a human was at the other end of the app asking inappropriate questions, the messages could also have come from a bot gone wrong. Many "chatbots" are built with rudimentary AI that lets the bot expand its pool of possible responses by learning from what users type to it. Cleverbot, launched in 1997, was one of the first of these programs.

If the app uses this kind of learning AI, malevolent users could have sent objectionable messages that "taught" the bot to ask those types of questions.
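The failure mode described above can be sketched in a few lines. The code below is purely illustrative — it is not the Santa app's actual code, and the class name and seed replies are invented for the example — but it shows how a learning bot with no content filter recycles user input into future replies, so one abusive user can poison what the next child sees.

```python
import random

class LearningBot:
    """Toy chatbot that grows its reply pool from whatever users type.

    Hypothetical sketch: demonstrates how an unfiltered learning bot
    can be poisoned, not how any real app is implemented.
    """

    def __init__(self):
        # Seed replies written by the developer.
        self.replies = ["Ho ho ho!", "Have you been good this year?"]

    def chat(self, message: str) -> str:
        # Answer with something previously learned or seeded.
        response = random.choice(self.replies)
        # The bot "learns" by storing every user message as a future
        # reply -- with no filtering, abusive input becomes output.
        self.replies.append(message)
        return response

bot = LearningBot()
bot.chat("What are you wearing?")  # a malicious user "teaches" the bot
# From now on, an innocent child may receive that same message back.
```

A real system would need moderation between the "learn" step and the reply pool; Microsoft's Tay, discussed below, lacked exactly that safeguard.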

One such incident happened in 2016, when Microsoft released a Twitter bot called Tay. Tay was intended to learn how to interact with people, but humans sent sexist and racist comments to the bot. Without any safeguards in place, the bot started spewing hate — and Microsoft had to shut down Tay within a day.