TikTok Removed 49 Million Videos in 6 Months, U.S. Was Second Worst for Rule Breaking

TikTok removed more than 49 million user videos globally in the second half of 2019, according to a transparency report released today.

The short-form video platform, widely used by teens, said the removed content accounted for less than one percent of all clips uploaded by its users in the period, which spanned July 1 to December 31, 2019.

The report revealed India led with the largest number of removals, at 16,453,360 videos deleted. It was followed by the U.S. with 4,576,888, Pakistan with 3,728,162, the United Kingdom with 2,022,728 and Russia with 1,258,853.

TikTok's systems caught and removed 98.2 percent of the offending clips before a user reported them, and 89.4 percent were taken down before they received any views.

The company said it received 500 legal requests for information from 26 countries in the second half of 2019, with the majority again coming from India (302 requests covering 408 accounts), where the app is now banned over national security concerns.

TikTok, which is owned by Chinese tech firm ByteDance, said it received 45 requests to "remove or restrict" content from governments across 10 countries, again led by India with 30 requests. China, it said, was not named in either category.

A spokesperson said: "We have never provided user data to the Chinese government, nor would we do so if asked. We do not and have not removed any content at the request of the Chinese government, and would not do so if asked."

While parent company ByteDance is headquartered in Beijing, the version of the app available in mainland China is a separate product known as Douyin.

It remains unclear if the report's findings will calm the fears of U.S. politicians, who have called for TikTok to be investigated for any cybersecurity and privacy risks, saying it's possible Americans' data could be accessed by the Chinese government.

This illustration picture, taken on May 27, 2020 in Paris, shows the logo of the social network application TikTok on the screen of a phone. MARTIN BUREAU/AFP/Getty

TikTok has repeatedly denied sharing data with China, stressing it stores all U.S. user data in the United States, with a backup in Singapore.

According to the TikTok report, new content moderation infrastructure rolled out last year helped provide more insight into the reasons videos were being removed.

Based on the analysis, it said 25.5 percent of the user content removed in December fell under the category of adult nudity and sexual activities, while more than 24 percent violated its minor safety policies; some of those videos were reported to authorities.

Content containing "illegal activities and regulated goods" made up 21.5 percent of the takedowns and 15.6 percent violated its suicide, self-harm, and dangerous acts policy. Less than one percent had content violating policies on hate speech, it said.

"As our young company continues to grow, we're committed to taking a responsible approach to building our platform and moderating content," head of U.S. public policy Michael Beckerman and head of safety Eric Han said in a joint statement.

"We're working every day to be more transparent about the violating content we take down and offer our users meaningful ways to have more control over their experience, including the option to appeal if we get something wrong. We'll continue to evolve our transparency report to provide greater visibility into our practices and actions."