Former Facebook employee Frances Haugen testified before the Senate Commerce Committee's consumer protection subcommittee on Tuesday and detailed some revelatory information she had gathered during her time at the company.
Much of the testimony, as well as the senators' questions and comments, centered on the social media giant's impact on young people. Along with discussing the company's efforts to grow its youth audience, Haugen—a former product manager on Facebook's civic misinformation team—spoke about internal research on the harm the service causes young people, especially with regard to eating disorders and body image.
Much else was discussed over the nearly three-and-a-half-hour hearing (which included a brief recess). Here is a look at five of the most surprising claims Haugen made.

1) "The buck stops with Mark"
While it's long been known that Mark Zuckerberg holds the ultimate power within Facebook as its CEO and chairman, Haugen explained the depth of his control. She said Zuckerberg "holds a very unique role in the tech industry" because he controls "over 55 percent of the voting shares at Facebook."
"There are no similarly powerful companies that are as unilaterally controlled," Haugen said, adding that "the buck stops with Mark. There's no one currently holding him accountable but himself."
She also mentioned how Facebook has claimed in the past that it had immunity under Section 230 and thus had "the right to mislead the court." (Section 230 of the Communications Decency Act says an interactive computer service can't be treated as the publisher or speaker of third-party content.)
2) Though Facebook has the power to change, it needs government intervention.
"If they make $40 billion a year, they have the resources to solve these problems," Haugen said when discussing the litany of issues created by Facebook. However, she said the company has time and again "put their astronomical profits before people."
Throughout her testimony, Haugen repeatedly called on Congress to step in to regulate and define rules for the world's largest social network.
3) The company has monitored how China, Iran and other foreign countries have used the service for espionage.
Haugen said her team "directly worked on tracking Chinese participation on the platform surveilling, say, Uyghur populations in places around the world. That you could actually find the Chinese, based on them, doing these kinds of things."
She went on, "We also found active persuasion of, say, the Iran government doing espionage on other state actors, so this is definitely a thing that is happening. And I believe that Facebook's consistent understaffing of the counterespionage, information operations and counterterrorism teams is a national security issue."
Haugen also said that "Facebook's very aware that this is happening on the platform, and I believe the fact that Congress doesn't get a report of exactly how many people are working on these things internally is unacceptable because you have a right to keep the American people safe."
4) Facebook changed its safety defaults around the time of the 2020 presidential election, and then it changed the settings back after the January 6 assault on the U.S. Capitol.
Haugen said Facebook changed its safety defaults in the run-up to the November election "because they knew they were dangerous and then returned them to their original defaults. They had to break the glass on January 6 and turn them back on, and I think that's deeply problematic."
5) The company is well aware not only of the harmful effects it has on young people but also of how the spread of misinformation has damaged society.
Haugen spoke often about how Facebook researchers extensively studied the negative impact Facebook and Instagram have on teenagers, especially in terms of mental health and body image. She also detailed how the company targeted young people, including those younger than 13.
She went on to discuss how false information spread on the network affects others. She said, "When I worked on civic misinformation, we discussed the idea of the misinformation burden, like the idea that when people are exposed to ideas that are not true, over and over again, it erodes their ability to connect with the community at large because they no longer adhere to facts that are consensus reality."
Facebook spokeswoman Lena Pietsch responded soon after the hearing concluded. In a statement, she said: "Today, a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives—and testified more than six times to not working on the subject matter in question."
Pietsch's statement continued, "We don't agree with her characterization of the many issues she testified about. Despite all this, we agree on one thing: it's time to begin to create standard rules for the internet. It's been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act."