Facebook Whistleblower Frances Haugen Links Company to Capitol Riot

A former Facebook employee has blamed the social media company for helping to fuel the January 6 insurrection by choosing to amplify hate and misinformation on the site.

Frances Haugen, a data scientist, revealed herself as the whistleblower in an interview on CBS' 60 Minutes on Sunday. She leaked tens of thousands of pages of internal documents, filed complaints with federal law enforcement and shared research with The Wall Street Journal suggesting that Facebook is lying about the progress it is making against hate, violence and misinformation.

During the interview, Haugen said Facebook "over and over again, has shown it chooses profit over safety" and that the company's choice to boost user activity, and therefore increase revenue, has actively caused real-world harm.

Haugen joined Facebook in 2019 and worked as a product manager on its Civic Integrity team, which aimed to fight misinformation during and after the 2020 election. She told 60 Minutes that she wanted to work in an area of the company that combats misinformation because she lost a friend to online conspiracy theories.

However, shortly after the election was declared in November 2020, Facebook decided to shut down Civic Integrity, which Haugen said was a major turning point ahead of the January 6 attack.

"They basically said, 'Oh, good, we made it through the election, there weren't riots, we can get rid of civic integrity,'" she said. "Now, fast forward a couple months, we got the insurrection.

"When they got rid of Civic Integrity, it was the moment where I thought 'I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous.'"

Haugen added that the problems of hateful content and misinformation spreading on Facebook can be traced back to 2018, when the company changed the algorithm that determines what people see in their feeds.

Haugen said that Facebook altered the algorithm so that posts with lots of engagement are pushed onto people's feeds.

Haugen claimed Facebook made this decision despite its own research showing that "content that is hateful, that is divisive, that is polarizing" gets the most engagement online because "it is easier to inspire people to anger than it is to other emotions."

"Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site. They'll click on less ads, they'll make less money."

Haugen said that while Facebook did turn on safety systems to reduce the spread of misinformation during the 2020 election, these were only temporary; the company then changed the settings back to how they were before in order to "prioritize growth over safety."

"That really feels like a betrayal of democracy to me."

While other social media sites and apps such as Parler and Telegram were found to have been used to organize the January 6 attack at the Capitol, none had as large a potential audience as Facebook. A number of prosecutors have cited Facebook as helping to organize the violence that day, which was carried out in part by far-right extremists.

During the interview, Haugen agreed with host Scott Pelley that Facebook "essentially amplifies the worst of human nature" but said that no one at the company is "malevolent."

"But the incentives are misaligned," she added. "Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume."

Haugen is due to testify before Congress on Tuesday about the documents she leaked.

In a statement to 60 Minutes, Facebook said: "Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.

"If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago."

In a statement to Newsweek regarding the claim that the company's desire for profit outweighs efforts to keep the platform safe, a Facebook spokesperson said: "The growth of people or advertisers using Facebook means nothing if our services aren't being used in ways that bring people closer together—that's why we are investing so much in security that it impacts our bottom line.

"Protecting our community is more important than maximizing our profits. To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13 billion since 2016."

The spokesperson added: "We've invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.

"We have a strong track record of using our research—as well as external research and close collaboration with experts and organizations—to inform changes to our apps."

Protesters gather outside the U.S. Capitol Building on January 6, 2021 in Washington, D.C. Pro-Trump protesters entered the U.S. Capitol building after mass demonstrations in the nation's capital during a joint session of Congress to ratify President-elect Joe Biden's election victory over Donald Trump. A former Facebook employee said the social network turned off safeguards designed to thwart misinformation, which helped contribute to the deadly attack at the Capitol. Tasos Katopodis/Getty Images

Update 10/04/2021 6:56 a.m. ET: This article has been updated with further comment from Facebook.