Nearly Half of Social Media Users Think Platforms Should Remove QAnon Content, Poll Shows

Almost half of social media users say platforms should remove content related to conspiracy theories such as QAnon, according to a new poll.

QAnon is a far-right conspiracy theory whose adherents believe that a number of high-profile members of the Democratic Party are part of a satanic cult involved in child sex trafficking. Adherents also claim that this secret group of sex traffickers and pedophiles is attempting to wage war against President Donald Trump from inside the government.

Conducted by Morning Consult, the study found that 47 percent of social media users think QAnon content should be removed from platforms. About 13 percent believe the content should be neither taken down nor given a warning label.

The study surveyed 2,073 social media users from October 6 to October 8, and reported a margin of error of plus or minus 2 percentage points.

The poll's findings come shortly after YouTube announced stricter rules as part of its "efforts to curb hate and harassment by removing more conspiracy theory content used to justify real-world violence."

In a blog post published Thursday, YouTube announced that it was expanding both its hate and harassment policies "to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence."

"One example would be content that threatens or harrasses [sic] someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate," said the company in its blog post.

YouTube also noted that nearly two years ago, it undertook a "major step" to minimize the reach of harmful information by updating its recommendations system.

"This resulted in a 70% drop in views coming from our search and discovery systems. In fact, when we looked at QAnon content, we saw the number of views that come from non-subscribed recommendations to prominent Q-related channels dropped by over 80% since January 2019," YouTube wrote. "Additionally, we've removed tens of thousands of QAnon-videos and terminated hundreds of channels under our existing policies, particularly those that explicitly threaten violence or deny the existence of major violent events."

In an email sent to Newsweek, YouTube noted that its policies are content-based and not speaker-based, meaning that any content—such as videos, channels and comments—that is removed is based on what's said and not who said it.

According to the email, the announcement does not mean the platform had previously failed to act against QAnon; YouTube has continued to invest in the policies, resources and products necessary to protect its community from harmful content.

Additionally, Facebook and Twitter have taken similar steps to combat the spread of conspiracy theories such as QAnon on their respective platforms.

A woman holds up a QAnon sign to the media as attendees wait for President Donald Trump to speak at a campaign rally at Atlantic Aviation on September 22, 2020 in Moon Township, Pennsylvania. Jeff Swensen/Getty

Despite the efforts social media sites have taken to curb the spread of QAnon content, the Morning Consult poll found that 25 percent of social media users and 24 percent of Americans who have heard of QAnon believe the conspiracy claims are at least somewhat accurate.

According to the study, nearly 40 percent of Republicans also said that the claims are somewhat accurate, with 21 percent saying they are very accurate.
