TikTokkers Are Getting Banned for Sharing QAnon Conspiracies As Anti-Trump Users Get Videos Removed

TikTok is shutting down the accounts of users who post QAnon-related content to the platform, and is removing political videos from users who failed to disclose that the content was paid for by a marketing company.

The video-sharing social network, which is primarily used by teenagers, has become a political battleground in the lead-up to the U.S. presidential election on November 3.

The far-right, disproven QAnon conspiracy theory alleges that Democrats and Hollywood elites are members of a cannibalistic secret society that traffics children. According to the conspiracy theory, President Donald Trump is on a secret mission to take down the deep state.

Both Twitter and Facebook have banned accounts that support QAnon, and TikTok is doing the same. The company is also redirecting searches for hashtags used to spread the conspiracy theory to its community guidelines.

"Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform. We've also taken significant steps to make this content harder to find across search and hashtags by redirecting associated terms to our Community Guidelines," a TikTok spokesperson told Newsweek.

"We continually update our safeguards with misspellings and new phrases as we work to keep TikTok a safe and authentic place for our community."

TikTok has been banning accounts associated with QAnon since August, but the initial announcement passed somewhat under the radar.

TikTok is also removing a series of political videos, many of them anti-Trump, from its platform, because the users who posted them did not disclose that the content was paid for by a marketing company.

TikTok bans political ads and requires paid-for videos to be clearly labeled as such, but a number of videos funded by the marketing firm Bigtent Creative, some of which racked up hundreds of thousands of views, were posted without any disclosure of the company's involvement.

"These guidelines also apply to paid content by influencers, and we rely on influencers and marketers to follow [US Federal Trade Commission] guidelines," a TikTok spokesperson told the BBC.

"We remove paid influencer content that's not disclosed as such as we become aware of it and are now taking action on this."

A member of the Bigtent Creative team also said that the videos had non-partisan funding and should not have included anti-Trump statements, the BBC reports.

A person wears a QAnon sweatshirt during a pro-Trump rally on October 3, 2020, in the borough of Staten Island in New York City. TikTok, Facebook and Twitter have started banning accounts associated with the conspiracy theory. Stephanie Keith/Getty Images