Democrats Say Facebook 'Knowingly' Allowed Extremists To Promote Capitol Attack

A group of Democratic representatives has called on Facebook to do more to prevent far-right extremist organizing on the world's largest social media platform, accusing CEO Mark Zuckerberg and his team of "knowingly" hosting content promoting the January 6 attack on the U.S. Capitol.

Energy and Commerce Committee Chairman Rep. Frank Pallone, Jr., Oversight and Investigations Subcommittee Chair Rep. Diana DeGette, Communications and Technology Subcommittee Chairman Rep. Mike Doyle, and Consumer Protection and Commerce Subcommittee Chair Rep. Jan Schakowsky wrote to Zuckerberg on Tuesday, demanding more information on Facebook's anti-extremism measures.

The letter cited multiple reports that Facebook's own staff raised concerns about the "rapid spread of extremism and disinformation" on the platform, only for their warnings to be dismissed by Zuckerberg and other executives.

"This deadly attack on the Capitol laid bare the dire consequences of hyperpolarization and extremism in our current political discourse—much of which is occurring on your platform," the four lawmakers wrote.

"With more than 3 billion monthly users across different services, Facebook must play a leading role in lessening the divide and lowering the temperature," they added.

Several social media platforms have come under increasing scrutiny in the aftermath of the January 6 Capitol attack, and amid rising concern over the threat posed by far-right domestic terrorism. Extremists use platforms such as Facebook, Twitter, YouTube and Telegram to radicalize users, recruit members and even plan attacks.

The January 6 protest in Washington, D.C., was organized online, with activists making it clear their goal was to block the certification of President Joe Biden's November electoral victory. Others publicly incited violence against lawmakers, law enforcement and leftist activists in the run-up to the event.

Concerns over extremist activism on Facebook are not new. The platform—and its competitors—have long been criticized for allowing disinformation and propaganda on their sites, a problem magnified by former President Donald Trump's political rise and time in office.

The platform has been forced to act by public and political pressure. Militant groups like the Oath Keepers and conspiracy theory pages linked to QAnon were removed before the violence in Washington, D.C., but critics have argued that the platform did not do enough to stop broader incitement.

The letter from the committee chairs to Zuckerberg lists several instances in which warnings were ignored. A Facebook researcher said in 2016 that the platform's recommendation tools were directing people towards extremist groups, and in 2018 senior executives were warned that the site's algorithms were encouraging "divisive content" in order to maximize user engagement.

And in 2020, just months before the Capitol was stormed, executives were told that "blatant misinformation and calls to violence" were swamping the platform's most popular "civic" groups, and warned of the need for action "to stop these conversations from happening and growing as quickly as they do."

The lawmakers said Zuckerberg and Joel Kaplan, Facebook's vice president of global public policy, were among those who "regularly balked at implementing reforms."

After the January 6 attack, the platform extended its block on Trump's Facebook and Instagram accounts. In a statement, Zuckerberg said the "shocking" events meant "the risks of allowing the President to continue to use our service during this period are simply too great."

"Over the last several years, we have allowed President Trump to use our platform consistent with our own rules, at times removing content or labeling his posts when they violate our policies," he said. "We did this because we believe that the public has a right to the broadest possible access to political speech, even controversial speech."

Facebook also said it was removing content praising the Capitol storming or inciting further violence.

Facebook has taken action it says will stem the spread of extremist rhetoric online. Its "Common Ground" task force was set up to roll back online polarization, while its "Integrity Teams" are charged with stopping the tide of disinformation and propaganda spread on the site.

But any initiatives proposed by these teams first have to clear Facebook's internal "Eat Your Veggies" vetting process, which the letter says is made up of representatives from the platform's "public policy, marketing, and public relations teams, among others."

The committee chairs are demanding more information on reports that Kaplan and others blocked or weakened proposed initiatives via the vetting process.

In response to Tuesday's letter, a Facebook spokesperson told Newsweek the platform has "banned QAnon and militia groups from organizing on our apps, designated more than 250 white supremacist groups, removed billions of fake accounts—and we're the only platform to build a global network of over 80 fact-checking partners to address misinformation."

"We make big decisions with input from people with different perspectives and areas of expertise and have used both external and internal research on the impact of polarization to change how we operate our platform."

This article has been updated to include a comment from Facebook.

Supporters of President Donald Trump protest inside the U.S. Capitol on January 6, 2021, in Washington, D.C. Brent Stirton/Getty Images