Former TikTok Moderator Sues Company for Allegedly Causing Her Mental Health Issues
A lawsuit filed by a former TikTok content moderator alleges that the company turned its back on her and thousands of other employees paid to watch countless hours of traumatic videos, harming their mental health.
Candie Frazier, who worked for third-party contracting firm Telus International, recently filed the class-action lawsuit against TikTok's parent company, ByteDance, in the U.S. District Court for the Central District of California.
The lawsuit detailed how she allegedly worked 12-hour days that consisted of watching countless videos involving "thousands of acts of extreme and graphic violence," including mass shootings, child rape, animal mutilation, cannibalism, gang murder, and genocide.
Bloomberg reported that TikTok's 10,000 content moderators are routinely exposed to such content, with little time to look away from their screens. They reportedly watch hundreds of videos over 12-hour daily shifts, with an hour off for lunch and two 15-minute breaks.
"Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to 10 videos at the same time," Frazier's lawyers said in the complaint.

According to the lawsuit, Frazier has trouble sleeping, and nightmares are a common occurrence.
"While we do not comment on ongoing litigation, we strive to promote a caring working environment for our employees and contractors," a TikTok spokesperson told Newsweek. "Our safety team partners with third-party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally."
ByteDance could not be reached for comment.
In July of this year, TikTok Head of U.S. Safety Eric Han wrote of how the platform was working to better itself in terms of user safety, content moderation and amending community guidelines "to advance the safety of our community and integrity of our platform."
He also mentioned the company's employees.
"In addition to improving the overall experience on TikTok, we hope this update also supports resiliency within our safety team by reducing the volume of distressing videos moderators view and enabling them to spend more time in highly contextual and nuanced areas, such as bullying and harassment, misinformation, and hateful behavior," Han stated.
The Pew Research Center estimated that approximately 72 percent of Americans interact daily on social media, use that has been linked to effects such as depression, anxiety and feelings of isolation among users.
Even in 2015, a Common Sense survey estimated that teenagers spent about 9 hours per day online—a number so high that many of them at the time worried for their own well-being.
A Canadian study from 2017 built on that data, asserting that even two hours per day of social media use was enough to harm mental health. A study from the United Kingdom argued that social media use disrupts sleep patterns.
The National Alliance on Mental Illness stated that "approximately 50 percent of insomnia cases are related to depression, anxiety or psychological stress."
Some researchers have even speculated that a constant bombardment of traumatic media exposure can lead to post-traumatic stress symptoms that can last years.
That includes individuals who continue viewing such content even though they know it is unhealthy.
Frazier is reportedly seeking compensation for herself and other TikTok moderators over the alleged psychological injuries, as well as a court order requiring the company to set up a medical fund for moderators.