Facebook Worse Than Google, Other Platforms on Misinformation, Hate, Whistleblower Says

Frances Haugen said she was recruited to work at Facebook in 2019, just before the highly contentious presidential election and the unforgettable year of 2020. Over the months that followed, the social media giant that employed her became one of the top platforms for divisiveness, hate and misinformation, she said, claims she would eventually make public as a whistleblower.

Her time there spanned not just the presidential election. There was a global pandemic, a movement for social justice reform, the death of George Floyd, the postponement of sports, school, business and everyday life, and the Capitol Hill insurrection, an attempt to overturn the 2020 election.

Haugen, who had previously worked at Google and Pinterest, said she was hired to address misinformation circulating on Facebook. She appeared on 60 Minutes to discuss some of the issues she saw.

"The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money," Haugen said on the show.

Haugen shared loads of documents with the Securities and Exchange Commission (SEC) and the Wall Street Journal, according to the report. Not just an email here or there, and not just a slip of the lip at the water cooler.

"I've seen a bunch of social networks, and it was substantially worse at Facebook than anything I've seen before," Haugen said Sunday night. "At some point in 2021, I realized I'm going to have to do this in a systemic way, that I'm going to have to get out enough [documents] that no one can question that this is real."

A smartphone screen displays the Facebook logo on a Facebook website background on April 7, 2021, in Arlington, Virginia. Photo by OLIVIER DOULIERY/AFP via Getty Images

Haugen said Facebook scrapped its civic integrity team after the 2020 presidential election was over, a decision that soured her on the company's direction. She believed disbanding the team opened the door for Facebook to be used as a staging ground for the January 6 riots at the Capitol.

"They basically said, 'Oh good, we made it through the election, there weren't riots, we can get rid of civic integrity now,'" Haugen said. "Fast forward a couple of months, and we had the Insurrection. When they got rid of civic integrity, it was the moment where I was like, 'I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous.'"

Facebook spokesperson Lena Pietsch told CNN Business on Sunday that its apps do more good and less harm than people might think. Pietsch said heading off misinformation remains a top priority.

"Every day our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place," Pietsch said. "We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true."

A Facebook representative said Sunday that the company's research is not only shared internally but also reviewed by outside parties for further discussion.

"We do a huge amount of research, we share it with external researchers as much as we can, but do remember there is ... a world of difference between doing a peer-reviewed exercise in cooperation with other academics and preparing papers internally to provoke and inform internal discussion," said Nick Clegg, Facebook's vice president of global affairs.

During her whistleblower interview Sunday night, Haugen ultimately said Facebook prioritized profit, even as its platform stoked divisive emotions.

"One of the consequences of how Facebook is picking out that content today is that it is optimizing for content that gets engagement, a reaction, but its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions," she said. Haugen also said, "if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money."