TikTok Limited Reach of Videos From Disabled Users, Documents Show

Social media application TikTok reportedly limited the reach of video content that was uploaded by users who appeared to have disabilities.

The moderation policy, outlined in documents obtained by Netzpolitik, was intended to protect users who were deemed to be "susceptible to harassment or cyberbullying based on their physical or mental condition" by limiting who could see their uploads, TikTok said.

Moderators were told to flag content from users who appeared to have autism, Down's syndrome or facial disfigurements.

The company told Newsweek that the policy has since been updated. "Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy," a TikTok spokesperson said in a statement shared today via email.

"This was never designed to be a long-term solution, and while the intention was good, it became clear that the approach was wrong," the statement continued. "We want TikTok to be a space where everyone can safely and freely express themselves, and we have long since changed the policy in favor of more nuanced anti-bullying policies."

According to Netzpolitik, content from disabled users that had been flagged by moderators would only be visible inside the country where it was first uploaded.

And in some cases, vulnerable users whose content attracted a few thousand views would end up being listed as "not recommended." That meant the videos would no longer be chosen by the app's algorithms to appear on the main "For You" feed of public uploads.

TikTok users typically share short-form videos under a minute in length, meaning moderators would likely have had very little time to determine if a user was disabled.

Netzpolitik reported the policies were in place until September this year and also swept up users who appeared to be self-confident and overweight, or homosexual. The site's source, who was not named, said the rules came from bosses based in Beijing who did not appear to listen to complaints that the moderation policy was insensitive or discriminatory.

TikTok is owned by a Chinese technology company called ByteDance, but has increasingly stressed its use of localized teams in its recent moderation and data policies. The U.S. team is led out of California and creates tailored rules for the American user base.

TikTok does not operate in China, although ByteDance has an equivalent service in the country called Douyin. TikTok, which reportedly has more than 1 billion active users, has come under scrutiny by U.S. politicians who recently launched a national security review into the company over its $1 billion purchase of the lip-sync video platform Musical.ly back in 2017.

Last month, the app was at the center of a censorship scandal when a teenage user was banned after uploading a video about China's brutal crackdown on the Uighur Muslim community. The TikTok account, @getmefamouspartthree, was later reinstated and an apology issued.

In a statement at the time, the company blamed a "human moderation error." Still, not everyone appears surprised the app is facing sudden growing pains.

"TikTok is speed-running the last decade of trust, privacy, safety and content moderation challenges faced by US user-generated content platforms," tweeted computer scientist Alex Stamos, who worked as the chief security officer at Facebook between 2015 and 2018. "What comes next?" he added, before warning: "Grooming, sextortion and human trafficking."
