Facebook Pulled Over 600,000 Instagram Accounts Belonging to Users Under 13 Over 3 Months

Facebook removed over 600,000 Instagram accounts from June to August this year belonging to users who didn't meet the minimum age requirement of 13, a Facebook executive said.

Antigone Davis, Facebook's head of global safety, said in testimony to a Senate Commerce subcommittee on Thursday that the company works to protect young people on its platforms.

"We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17," Davis said.

Davis also disputed recent research that showed the harmful effects on teenagers from Instagram. The research sparked public outrage against Facebook, prompting the company to put a temporary pause on developing a version of Instagram meant mainly for children ages 10 to 12.

For more reporting from the Associated Press, see below.

Facebook Instagram Senate Hearing
Senator Richard Blumenthal (C) speaks as Facebook's global head of safety, Antigone Davis, testifies remotely before a hearing of the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security examining how to protect children online, focusing on Facebook, Instagram and mental health harms, in Washington, D.C., on September 30, 2021. Tom Brenner/Pool/AFP via Getty Images

Davis was summoned by the panel amid scrutiny of how Facebook handled internal findings that indicated potential harm to some of its users, especially girls, while the company publicly downplayed the negative impacts.

The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents.

For some Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook's own researchers who alerted the social network giant's executives to Instagram's destructive potential.

Davis said in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren't old enough to use them do not.

"This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge," Senator Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. "Revelations about Facebook and others have raised profound questions about what can and should be done to protect people."

Blumenthal and Senator Marsha Blackburn of Tennessee, the panel's senior Republican, also plan to take testimony next week from a Facebook whistleblower, believed to be the person who leaked the Instagram research documents to the Journal.

Despite the well-documented harms, Facebook executives have consistently played down Instagram's negative side and have forged ahead with work on Instagram for Kids, until now. On Monday, Instagram head Adam Mosseri said in a blog post that the company will use its time out "to work with parents, experts and policymakers to demonstrate the value and need for this product."

Already in July, Facebook said it was working with parents, experts and policymakers when it introduced safety measures for teens on its main Instagram platform. In fact, the company has been working with experts and other advisers for another product aimed at children — its Messenger Kids app that launched in late 2017.

The focused outrage transcending party and ideology contrasts with lawmakers' posture toward social media generally, which splits Republicans and Democrats. Republicans have accused Facebook, Google and Twitter, without evidence, of deliberately suppressing conservative, religious and anti-abortion views.

Democrats train their criticism mainly on hate speech, misinformation and other content on the platforms that can incite violence, keep people from voting or spread falsehoods about the coronavirus.

The bipartisan pile-on against Facebook proceeds as the tech giant awaits a federal judge's ruling on a revised complaint from the Federal Trade Commission in an epic antitrust case and as it tussles with the Biden administration over its handling of coronavirus vaccine misinformation.

Meanwhile, groundbreaking legislation has advanced in Congress that would curb the market power of Facebook and other tech giants Google, Amazon and Apple — and could force them to untie their dominant platforms from their other lines of business. For Facebook, that could target Instagram, the social media juggernaut valued at around $100 billion that it has owned since 2012, as well as messaging service WhatsApp.

Antigone Davis
Facing lawmakers' outrage over Facebook's handling of internal research on the harm Instagram can do to teens, Facebook's head of global safety, Antigone Davis, told Congress on Thursday, September 30, 2021, that the company is working to protect young people on its platforms. In this March 20, 2018, file photo, Davis speaks during a roundtable on cyberbullying with first lady Melania Trump in the State Dining Room of the White House in Washington. Evan Vucci, File/AP Photo