TikTok Faces Huge Test as Graphic Video Floods Platform Popular With Teens

A graphic beheading video that went viral on TikTok highlights flaws in the platform's moderation system and could pose a major challenge to the company's effort to prove the app is suitable for minors, experts say.

The now-removed video initially duped viewers into thinking they were watching a typical TikTok dance video, only to cut to an explicit clip of a man being beheaded in a bathroom.

In a statement to Newsweek, a spokesperson for TikTok said: "We appreciate the concerted effort by our community to warn about an unconscionably graphic clip from another site that was spliced into a video and brought onto TikTok.

"The original video was quickly removed, and our systems are proactively detecting and blocking attempted re-uploads of the clip to catch malicious behaviour before the content can receive views. We apologize to those in our community, including our moderators, who may have come upon this content."

This specific video has since been added to TikTok's "Hashbank" system, which now automatically detects the clip and blocks attempted re-uploads before they can be posted.
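TikTok has not published details of how its "Hashbank" works, but hash-matching systems of this kind generally fingerprint known violating media and compare every new upload against that list. The sketch below is a minimal, hypothetical illustration in Python: it assumes exact frame-level hashing with the standard-library hashlib module, whereas production systems typically rely on perceptual hashes that tolerate re-encoding and cropping. The class and method names are illustrative, not TikTok's.

```python
import hashlib

# Hypothetical illustration of a hash-bank check; not TikTok's actual system.
# Real platforms generally use perceptual hashes (robust to re-encoding),
# not exact SHA-256 digests of raw frame bytes.

class HashBank:
    def __init__(self):
        self._known_hashes = set()

    def add_known_violation(self, frame_bytes: bytes) -> None:
        """Fingerprint a frame from a known violating clip and store it."""
        self._known_hashes.add(hashlib.sha256(frame_bytes).hexdigest())

    def is_blocked(self, upload_frames: list[bytes]) -> bool:
        """Block the upload if any frame matches a stored fingerprint."""
        return any(
            hashlib.sha256(frame).hexdigest() in self._known_hashes
            for frame in upload_frames
        )


# Example usage with placeholder frame data.
bank = HashBank()
bank.add_known_violation(b"frame-bytes-from-removed-clip")

new_upload = [b"harmless-dance-frame", b"frame-bytes-from-removed-clip"]
print(bank.is_blocked(new_upload))  # True: the re-upload contains a known frame
```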

For TikTok, the video could prove damaging to users' trust in the app and their relationship with it.

"Any time something so graphic and horrifying is posted and disseminated online on a platform like TikTok that is used by so many, especially teens and younger generations, it's going to have major effects on how users feel about the platform," explained Attila Tomaschek, a digital privacy expert and researcher at ProPrivacy.

"Particularly because the video deliberately tricks users into clicking on it by claiming its subject matter is something altogether different, users are feeling apprehensive about using the app and clicking on a seemingly innocent video, only to be instead insidiously bombarded by something unequivocally gruesome."

A 20-year-old TikTok user, Ry, told Newsweek they stopped using the app while the video was circulating, and that they want stronger safeguards in place to prevent it from happening again.

"I decided to avoid TikTok, not wanting to see death on an app for entertainment. I avoided it for a few days before realizing the video was seemingly removed from the platform."

"I hope for a better filter as to what is posted on TikTok. I would like to know that videos are being actively monitored, and that videos with extreme violent content are taken down as soon as possible," they added.

A 16-year-old TikTok user, Justin, saw the video and was left "traumatized" by it, with his "hands shaking" and his "heartbeat still for a second."

"Currently, I'm not watching TikTok that often anymore. I'm just in my 'Following' page so I only see videos from people I follow," he said, commenting on how the relationship with TikTok was changed by the video.

Nickolas Jones, a postdoctoral scholar at the Sue & Bill Gross School of Nursing at the University of California, has researched graphic content online extensively and told Newsweek: "It certainly can vary from person to person, but our research suggests that exposure to graphic content is associated with increased psychological distress both cross-sectionally, and over time."

"This means that for some people, seeing graphic imagery like beheadings may incite psychological distress in the moment and sensitize them in a way that might make future exposures to graphic content even more distressing."

Jones added that he believes accidental exposure to a graphic clip, such as a spliced one, can change the impact as well, "although it's not super clear how different intentional versus accidental exposures are. If you are watching a puppy video and are suddenly confronted with a graphic assault of a human body, you'll probably be horrified and upset by it. Depending on how potent the imagery is, you might have a hard time blocking it from your mind."

Although TikTok responded firmly to the crisis, the video reached a wide audience before its removal. Many of those viewers have said online that the video left them "traumatized," and many, like Justin, were minors.

"Because TikTok is geared towards a younger audience than most other social media platforms, its responsibility in containing the dissemination of these types of videos is inherently greater than it is for others," said Tomaschek.

"TikTok needs to remain a safe environment for younger generations to share videos and communicate with one another online. When such graphic content weasels its way into the mix, then the integrity of that safe environment is compromised. TikTok has a massive responsibility to prevent this type of content from ending up on its platform and to remedy the situation quickly if and when it does," added Tomaschek.

In recent months, TikTok has actively attempted to reform its image as a teenager-friendly app. As of January, accounts belonging to users under 16 are automatically set to private when created, and their videos cannot be downloaded by other users.

Users aged 13 to 15 can also only allow "friends" or "no one" to comment on their content.

Gaia Beck, a tech expert and founder of the child-oriented app Chipping In, predicts that TikTok's recent moderation failure could, unsurprisingly, damage its reputation in the eyes of parents.

"What's alarming is that it took nothing more than a short cover video to blindside their filters and let gratuitously gory content slip through. Any parent would ask themselves 'what else could be hiding behind seemingly innocent videos?'" she told Newsweek.

"For the video to be able to go viral before TikTok was even aware of its presence, gives the impression that they don't have the current measures in place to catch future offensive and explicit videos before it's too late. I'm sure many parents are now thinking twice before allowing their children to scroll so much uncensored content."

Of course, TikTok isn't the first social media platform to accidentally allow graphic content onto its site, and it likely won't be the last. But unlike other platforms, TikTok's For You page lets videos go viral far more quickly and reach a far wider audience.

So how did such a graphic video manage to slip through TikTok's seemingly tightly woven net?

One TikTok content moderator told Newsweek that the splicing technique likely "tricked" the AI system in place. When a video is shared on the app, an AI system automatically moderates it, scanning for content such as nudity, death and gore. However, the moderator said users reformat videos multiple times until they slip past the AI moderation. Only when a video reaches 500 views is it then sent to a human moderator.
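The moderator's account suggests a tiered pipeline: automated screening at upload, then human review only once a video crosses a view threshold. The sketch below is a minimal illustration of that flow under those assumptions; the function names, video fields and decision labels are hypothetical, and only the 500-view figure comes from the moderator's description.

```python
# Hypothetical sketch of the tiered moderation flow the moderator describes:
# automated screening at upload, human review once a video passes 500 views.
# Names and fields are illustrative, not TikTok's actual API.

HUMAN_REVIEW_THRESHOLD = 500  # views, per the moderator's account

def ai_screen(video: dict) -> bool:
    """Placeholder for an automated classifier flagging nudity, death or gore."""
    return video.get("flagged_by_model", False)

def moderate(video: dict) -> str:
    if ai_screen(video):
        return "removed_at_upload"
    if video.get("views", 0) >= HUMAN_REVIEW_THRESHOLD:
        return "queued_for_human_review"
    # Spliced clips that fool the model and stay under the view threshold
    # remain live, which is the gap described in this article.
    return "live_unreviewed"

print(moderate({"flagged_by_model": False, "views": 1200}))  # queued_for_human_review
print(moderate({"flagged_by_model": False, "views": 40}))    # live_unreviewed
```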

The recent viral video isn't the only one to fly under the radar and get through initial moderation, the content moderator added; they often see "funny clips or dances" that, when slowed down, reveal nudity or other guideline breaches.

"Not much can be done to prevent these with current systems. We try and stop the wrong ones, but there will be one offs. Humans make errors, but AI systems get tricked too," they add.

"They need to get this stuff fixed quickly," summarised Justin, echoing the thoughts of many TikTok users.

Update 06/09/2021 8:40 a.m. ET: This article was updated to include comments from Nickolas Jones.

A smartphone displaying the logo of the Chinese social network TikTok, pictured on January 21, 2021, in Nantes, western France. TikTok could face problems after a graphic video made it past initial moderation. Loic Venance/AFP via Getty Images