Frances Haugen: 5 Biggest Revelations From Facebook Whistleblower's '60 Minutes' Interview

Ex-Facebook employee Frances Haugen made a series of explosive claims about the company, including that it worked to "prioritize growth over safety," during a 60 Minutes interview.

During the interview, which aired on CBS on Sunday, data scientist Haugen, 37, revealed herself to be the whistleblower who had shared internal documents with The Wall Street Journal.

Haugen said she left Facebook earlier this year after she became concerned with choices she claimed the company made, but not before secretly copying internal memos and documents.

These documents were shared with The Wall Street Journal, which over the past few weeks has published reports revealing the inner workings of the social media giant.

Among the claims Haugen made about Facebook in the interview with host Scott Pelley were that the company harms its users and that it removed safety systems designed to tackle misinformation.

Haugen had been part of Facebook's Civic Integrity group, which worked on risks to elections, including tackling misinformation on the platform.

Facebook has since said the leaks were "misleading" and that the company had done more good than harm, and it continues "to make significant improvements to tackle the spread of misinformation and harmful content."

Here are five major revelations from Haugen's 60 Minutes interview:

Claim: Facebook does little to combat hate speech and violence

Haugen cited an internal study from earlier this year in which the company reportedly said: "We estimate that we may action as little as 3-5 percent of hate and about 6-tenths of one percent of violence and incitement on Facebook, despite being the best in the world at it."

She later added: "When we live in an information environment that is full of angry, hateful, polarizing content, it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world."

The ethnic violence she referred to includes events in Myanmar in 2018, when the country's military used Facebook to promote genocide against the country's Muslim minority.

Facebook's response: "We've invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority.

"If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago. We have a strong track record of using our research—as well as external research and close collaboration with experts and organizations—to inform changes to our apps."

Claim: Facebook prioritized making money over users' safety

In a damning claim about her former employer, Haugen said that Facebook chose its own interests over those of its users.

Haugen said Facebook's own research had shown that it optimized for content that sparked engagement, or reaction.

She added that the research showed content "that is hateful, that is divisive, that is polarizing" spreads because "it's easier to inspire people to anger than it is to other emotions."

Haugen continued to tell Pelley: "Facebook has realized that if they change the algorithm to be safer people will spend less time on the site, they'll click on less ads, they'll make less money."

Facebook's response: "The growth of people or advertisers using Facebook means nothing if our services aren't being used in ways that bring people closer together—that's why we are investing so much in security that it impacts our bottom line.

"Protecting our community is more important than maximizing our profits. To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13 billion since 2016."

Claim: Civic Integrity was disbanded shortly after the 2020 presidential election

Haugen said that once the highly charged 2020 presidential election was over, Facebook disbanded the Civic Integrity group.

It happened just months before the deadly riot at the Capitol, where supporters of outgoing President Donald Trump stormed Congress as lawmakers were approving President-elect Joe Biden's victory.

Haugen told Pelley: "They (Facebook) told us, 'We're dissolving Civic Integrity.' Like they're basically saying, 'Oh, good, we made it through the election. There wasn't riots. We can get rid of Civic Integrity now.' Fast forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, 'I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous.'"

Facebook's response: "The goal of the Meaningful Social Interactions ranking change is in the name: improve people's experience by prioritizing posts that inspire interactions, particularly conversations, between family and friends—which research shows is better for people's well-being—and deprioritizing public content.

"Research also shows that polarization has been growing in the United States for decades, long before platforms like Facebook existed, and that it is decreasing in other countries where Internet and Facebook use has increased. We have our role to play and will continue to make changes consistent with the goal of making people's experience more meaningful, but blaming Facebook ignores the deeper causes of these issues —and the research.

Claim: Instagram contributes to harming teenage girls

During the interview, Pelley referred to an internal study carried out by Facebook that showed Instagram, a social media platform it owns, harmed teenage girls.

He added: "One study says 13.5 percent of teen girls say Instagram makes thoughts of suicide worse, 17 percent of teen girls say Instagram makes eating disorders worse."

In response, Haugen said: "And what's super tragic is Facebook's own research says, as these young women begin to consume this, this eating disorder content, they get more and more depressed.

"And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook's own research says it is not just the Instagram is dangerous for teenagers, that it harms teenagers, it's that it is distinctly worse than other forms of social media. "

Facebook's response: "We do internal research to ask hard questions and find out how we can best improve the experience for teens and we will continue doing this work to improve Instagram and all of our apps.

"It is not accurate that leaked internal research demonstrates Instagram is 'toxic' for teen girls. The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced. This research, like external research on these issues, found teens report having both positive and negative experiences with social media."

Claim: European political parties ran negative campaigns due to Facebook's algorithm

Haugen claimed users interacted more with content that made them furious and added: "The more anger that they get exposed to, the more they interact and the more they consume."

This led major political parties in Europe to complain to Facebook. An internal report obtained by Haugen said the parties "...feel strongly that the change to the algorithm has forced them to skew negative in their communications on Facebook...leading them into more extreme policy positions."

She added that the parties believed that if they did not take those positions, they "won't win in the marketplace of social media."

According to 60 Minutes, Haugen is set to testify before Congress later this week and said the federal government should impose stronger regulations on the company.

Facebook's response: "We agree it's time for updated internet regulations and have been calling for it ourselves for two and a half years. Every day, we make difficult decisions on where to draw lines between free expression and harmful speech, privacy, security, and other issues, and we use both our own research and research from outside experts to improve our products and policies.

"But we should not be making these decisions on our own which is why for years we've been advocating for updated regulations where democratic governments set industry standards to which we can all adhere."

Facebook is one of the largest companies in the world and registered some 2.85 billion monthly users in March 2021, according to its investor report.

In a statement sent to 60 Minutes, Facebook said: "Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true."

It added: "If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago."

A Facebook logo on a smartphone held in front of a computer screen in Los Angeles, California on August 12, 2021. Facebook whistleblower Frances Haugen said the company knows its platform has done little to tackle hate speech. Chris Delmas/AFP via Getty Images
