
YouTube Struggles With Child Porn Moderation: Bans Channels By Mistake While Letting Offenders Slide

On Saturday morning, nearly a dozen YouTubers woke up to find their channels wrongly terminated. These content creators have very little in common: Trainer Tips and Mystic7 make Pokemon Go videos, Billiam and Vailskibum94 make video essays, and JustFlippers and Sqaishey Quacks dedicate their time to child-friendly content. All of them received emails saying that years-old videos "were flagged for review" and that they could receive strikes. Some said their channels and Google accounts were deleted entirely, and it took almost a day for their channels to be reinstated.

YouTubers Banned for CP

There is one common thread that ties all of these channels together. Every one of them has included the term "CP" in a video title or tags (YouTube keywords), which YouTube's algorithm appears to have mistaken for "child porn." (YouTube did not comment to Newsweek on what caused the bans or whether they were indeed about minor sexual content.)

In Pokemon Go, "CP" refers to Combat Power, a term the pair of Pokemon YouTubers used frequently. In a video posted to Mystic7's second YouTube channel, the influencer expresses his distress over the sudden and unexpected removal of his channel. "It's literally our livelihoods that have been shut down unfairly," Mystic7 said. An email sent from a YouTube moderation bot said that "sexual content involving minors is particularly sensitive." The video in question featured Mystic7 walking around town catching pocket monsters; there wasn't a single moment of "sexual content." Both Pokemon Go YouTubers had their channels reinstated within hours of tweeting or making videos about the issue.

The other channels affected all made videos about the video game Club Penguin. JustFlippers dedicated his channel to the game and told Newsweek he is very proud of the 700 subscribers he's amassed over two years. For a full day, he couldn't access his channel or Gmail account to find out what exactly was wrong. "I think YouTube should really fix their automated flag system because they are just deleting channels without warning," Flippers said. "I used 'CP' in most of my videos but YouTube doesn't understand that it can mean a lot of things."

JustFlippers, along with Billiam and the rest, had their channels reinstated after a full day. That's 24 hours of their job disappearing with little response from their employer, YouTube. Some of the affected content creators Newsweek spoke to feared that the algorithm, in its ironic wisdom, would recommend their videos less because their views had disappeared.

"Some channels have been temporarily affected due to the nature of the detection software," Billiam said. "I mean clearly implementing code to detect when people upload child porn is good, but no one came out and said the issue would be resolved. Multiple channels were publicly contacting the Team YouTube Twitter account, and no one received a response which acknowledged the common issue between them."

In a statement to Newsweek, a YouTube spokesperson said, "With the massive volume of videos on our site, sometimes we make the wrong call. When it's brought to our attention that a video or channel has been removed mistakenly, we act quickly to reinstate it. We give uploaders the ability to appeal these decisions and we will re-review the videos."

While these channels struggled with the algorithm, others managed to abuse it to share what YouTube has been trying so hard to combat.

YouTube’s Battle With the Sexual Exploitation of Minors

Just one day after the "CP" bans, YouTuber MattsWhatItIs uploaded a video titled "Youtube is Facilitating the Sexual Exploitation of Children, and it's Being Monetized (2019)." The 12-hour-long video has amassed more than half a million views and a front-page Reddit post that has been gilded multiple times. The video documents what Matt claims "is a wormhole of a softcore pedophile ring": multiple channels and videos dedicated to underage kids in compromising positions.

Starting out with a fresh YouTube account on a VPN, Matt searches YouTube for the term "bikini haul." As he clicks on more videos, the algorithm starts to recommend what it believes will increase his watch time. "It algorithmically decides the content you should be watching, so you stay on the website and watch more videos, you'll see advertising and YouTube will generate more revenue," Matt says in the video.

None of the clips featured is sexually explicit; they show girls stretching or doing gymnastics. These videos can rack up tens of thousands, or even millions, of views and aren't intentionally malicious. Viewers might not be as innocent.

Most have comments disabled, but others are filled with sexually explicit admiration from strangers or timestamps pointing to moments when the girls are in compromising positions. "Report this video, sorry little lady but not everyone on here has good intentions," wrote one commenter on one of the clips mentioned in Matt's video essay. Some of these videos have been gathered and reposted to YouTube with advertising enabled, allowing the content to make money off creepy clickers.

[Image: A screenshot from MattsWhatItIs' latest video. YouTube]

Rumors of a "pedophile ring" on YouTube have been circulating on the web for years. In 2017, The Times of London found that brands like "BT, Adidas, Deutsche Bank, eBay, Amazon, Mars, Diageo and Talktalk" all had ads on videos "showing scantily clad children."

YouTube does have a very strict policy against child pornography. The YouTube community guidelines say that "safeguarding the emotional and physical well-being of minors is a priority for YouTube." If a report is sent to YouTube and a moderator confirms the content seeks to sexualize or exploit minors, the channel and videos are terminated and the content is reported to the National Center for Missing and Exploited Children or the proper authorities. Only a fraction of a percent of reported content is CSAM (Child Sexual Abuse Material).

A recent blog post from Google outlines how YouTube is "toughening its approach" to family-friendly content. It says ads are being removed from channels that inappropriately target families, and that comments featuring inappropriate discussions about minors have been blocked. "Across the board we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies," the post said.

In a statement to Newsweek about the video, a YouTube spokesperson said, "Any content - including comments - that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We enforce these policies aggressively, reporting it to the relevant authorities, removing it from our platform and terminating accounts. We continue to invest heavily in technology, teams and partnerships with charities to tackle this issue."

Serving content that is algorithmically chosen based on personal viewing habits is still a new industry, with new problems. YouTube has become a place that can be exploited by those with bad intentions, even as the company actively takes measures against them.
