How YouTube Censorship Bots Are Crashing Creator Careers

Between July and September 2018, Google removed more than 58 million videos and deleted 224 million comments from YouTube. This rise in self-regulation is due in part to mounting pressure from government officials and influence groups who believe the platform has been overrun with extremist and controversial content. It’s no small feat: with eight hours of content posted to YouTube every minute, human review is virtually impossible.

The YouTube algorithm is a constantly evolving body of code that automatically scans keywords, headlines and even audio to determine whether content follows the rules. Videos that break YouTube’s terms of service receive strikes; multiple strikes mean suspensions, takedowns or permanent bans. Content creators can appeal these decisions for manual review, but any downtime at all means lost revenue. For smaller channels this can be a big deal, especially when the algorithm makes a mistake. Just ask NickTheSmoker.

In early December, YouTube implemented an algorithm change meant to cull more “risqué” content. NickTheSmoker has been a YouTuber for four years, reviewing cigarette brands from his car or apartment. “I’m not an advocate of smoking or underage tobacco/vape use,” Nick told Newsweek. “A lot of my viewers don’t even smoke, but they say my videos are soothing, almost therapeutic.”

On December 6, Nick woke up to a suspended channel and four separate community guideline strikes. YouTube’s support website says strikes are issued “to videos that contain nudity or sexual content, violent or graphic content, harmful or dangerous content, hateful content, threats, spam, misleading metadata, or scams.” Receiving three strikes within a three-month period terminates a channel, but this was the first time Nick had received any. Still, his channel was gone.

Larger YouTubers, like commentary channel LeonLush, rushed to Nick’s aid to show support. YouTube removed two of the strikes from Nick’s channel but left the other two, which means NickTheSmoker can’t upload for the next three months. (A YouTube spokesperson told Newsweek that in the videos that received strikes, Nick linked to outside sites selling “certain regulated or illegal goods,” which breaks the terms of service.)

NickTheSmoker is back up and running, but other channels have been wiped out in the purge.

Mumkey Jones (who asked that his real name not be used) has been making videos on YouTube under multiple accounts for the past nine years. The Mumkey Jones channel found success with an “edgy sense of humor” that “appealed to teenagers,” the content creator told Newsweek. His most successful videos center on Elliot Rodger, the 22-year-old mass shooter who killed six people near the University of California, Santa Barbara campus in Isla Vista, California. Rodger identified as an “incel,” short for involuntary celibate and a moniker for members of misogynistic online communities. Fueled by hatred for women, and for the “normies” and “Chads” he believed they were attracted to, he laid out a “three phase plan” that ended with him stabbing three people and going on a shooting spree. His digital footprint was massive: he posted multiple videos and a 137-page manifesto detailing his life of loneliness, rejection and “virginity.”

“When the Elliot Rodger story first broke in 2014, I was in an online forum that dug up his cringey videos and manifesto, and we couldn’t believe what we were seeing,” Jones said. “It was fascinating to me that this soft-spoken, introverted, embarrassing kid actually committed a massacre.”

Jones consumed all of this — the manifesto, dozens of YouTube videos and even Rodger’s hacked email account — to create dozens of videos with titles like “I Pretended to be Elliot Rodger on BetterHelp”, “The Day of Retribution - Elliot Rodger vs Reality” and “Elliot Rodger's Anime Reviews.” Each video examined some facet of Rodger’s personality, like the fact that he loved lottery tickets and would drive across the country because he believed he could win. In Jones’ eyes, he “used comedy to drive home the point that Elliot Rodger isn’t somebody to aspire to. He was a sad loser who killed innocent people because he didn’t know how to talk to girls.”

“If we aren’t allowed to make fun of spree killers, then who are we allowed to make fun of?” Jones said.

On December 11, the algorithm caught up to Jones.

He suddenly received six new strikes on his YouTube channel. Five of these were on videos that had “Elliot Rodger” in the name, including a private one with zero views titled “elliot.amv.” It featured a parody song of Rodger’s final days with lyrics like “I still recall all the good times we had/ back in Isla Vista, you were my virgin, I was your Chad.” Jones feels the algorithm unfairly targeted him based on titles alone, not on his content, which he maintains is nuanced and satirical.

“I was making fun of a hateful, murderous person, and the faceless YouTube algorithm assumed I was somehow endorsing his hatred,” he said.

(A spokesperson for YouTube said his channel was flagged by users and that he received the strikes for “violating long standing policies against hate speech and glorifying violence, including glorifying a shooter and promoting/inciting hatred towards a protected group of people.”)

For days after the ban, Jones was unable to reach an actual YouTube representative. “Sometimes you have to rely on software. And sometimes that software goes crazy and deletes a man’s career,” Jones said. For now, Jones is staying away from YouTube and thinking of moving his content to a competing site, like Twitch. His thousands of fans, who have been flooding Twitter with messages of support, are sticking by his side.

After days of silence, Jones finally received a response from the Team YouTube Twitter account: the private video violated YouTube’s policy on “harmful/dangerous content.” Another tweet elaborated: “we don't allow content that appears to be posted in a shocking, sensational, or disrespectful manner. Including promoting, glorifying or encouraging violence (e.g. mass shootings).”

Jones believes that YouTube has changed from “a platform intended for the common man to make a name for himself to yet another platform for celebrities.” Even if you disagree with Jones’ content (and there’s a lot to find distasteful), it’s clear that YouTube is moving away from its roots. Channels that push limits and take on complicated subjects and conversations are now struggling on a platform they helped legitimize, as it props up new stars and celebrities.

UPDATE: A YouTube spokesperson asked us to note that in both Nick’s and Mumkey’s cases, the strikes were manually applied by a YouTube employee. The algorithm may have surfaced the videos, but the flags were applied by hand.

Earlier this week, Felix “PewDiePie” Kjellberg recommended some smaller channels that his 77 million subscribers should check out. One was “e;r”, which makes anime reviews laced with extremist iconography and anti-Semitic rhetoric. The media backlash shocked the YouTuber, who told his fans that news sites were blowing the situation out of proportion. His fans stood by him, bombarding the reporters who published these stories and hacking the Wall Street Journal website to “apologize to PewDiePie.”

Creators who make content that isn’t as advertiser-friendly as Will Smith bungee jumping out of a helicopter or James Corden singing in his car are having a rough time surviving on the platform. These content creators who have dedicated their lives to their channels and fostering their communities can be wiped away overnight.

“People like me were lucky that we were still able to find great success on YouTube in the year 2018, but if my channel termination is any indication, it looks like they wanna clean house for 2019,” Jones said. “All hail Will Smith. All hail Jimmy Kimmel.”
