Big Tech Execs Avoid Endorsing Legislation to Protect Kids, Say They're Already Complying

Snapchat, TikTok and YouTube said during Tuesday's Senate Commerce Consumer Protection Subcommittee hearing that they already are keeping children safe on their platforms, the Associated Press reported.

"The problem is clear: Big Tech preys on children and teens to make more money," Democratic Senator Edward Markey of Massachusetts said during the hearing.

Markey went on to ask the executives—Michael Beckerman, a TikTok vice president and head of public policy for the Americas; Leslie Miller, vice president for government affairs and public policy of YouTube owner Google; and Jennifer Stout, vice president for global public policy of Snapchat parent Snap Inc.—if they would support his legislation that would give new privacy rights to children, and ban targeted ads and video autoplay for kids.

The executives said they already comply with the proposed restrictions and want to work with lawmakers as any new laws are drafted.

Miller said YouTube deleted millions of accounts created by children under age 13 this year.

"We took action on more than 7 million accounts in the first three quarters of 2021 when we learned they may belong to a user under the age of 13—3 million of those in the third quarter alone—as we have ramped up our automated removal efforts," Miller said.

Earlier this year, a panel of the House Oversight and Reform Committee investigated YouTube's children's video service, YouTube Kids, and told YouTube CEO Susan Wojcicki that the service does not do enough to protect children from being exposed to potentially harmful material on the streaming service.

Lawmakers concluded that YouTube streams inappropriate material in "a wasteland of vapid, consumerist content" to deliver ads to the children watching. In 2019, Google agreed to pay $170 million to settle allegations that YouTube collected children's personal data without parental consent. The House committee's letter pointed out that despite the 2019 settlement, YouTube Kids still streams ads to children.

YouTube said it strives to provide children and families with parental controls and protections to limit children from viewing age-inappropriate content. YouTube also emphasized that the 2019 settlements involved the primary YouTube platform, not YouTube Kids.

For more reporting from the Associated Press, see below.

Photo: The Snapchat and YouTube apps. During Tuesday's Senate Commerce subcommittee on consumer protection hearing, an executive for YouTube owner Google said it deleted millions of accounts created by children under age 13 this year. Richard Drew/Associated Press

"Everything you do is to add more eyeballs, especially kids, and keep them on your platforms for longer," Democratic Senator Richard Blumenthal of Connecticut said at the start of a hearing of the subcommittee that he heads.

The panel took testimony recently from a former Facebook data scientist, who laid out internal company research showing that the company's Instagram photo-sharing service appears to seriously harm some teens. The subcommittee is widening its focus to examine other tech platforms with millions or billions of users that also compete for young people's attention and loyalty.

"We're hearing the same stories of harm" caused by YouTube, TikTok and Snapchat, Blumenthal said, adding that the companies are offering only "tweaks and minor changes" to their operations amid rising concern over the platforms' potential to harm young users.

"This is for Big Tech a Big Tobacco moment. ... It is a moment of reckoning," he said. "There will be accountability. This time is different."

TikTok has tools in place, such as screen time management, to help young people and parents moderate how long children spend on the app and what they see, Beckerman said. "We are determined to work hard and keep the platform safe," he said.

The company said it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. The video platform, wildly popular with teens and younger children, is owned by the Chinese company ByteDance. In only five years since launching, it has gained an estimated 1 billion monthly users.

The three platforms are woven into the fabric of young people's lives, often influencing their dress, dance moves and diet, potentially to the point of obsession. Peer pressure to get on the apps is strong. Social media can offer entertainment and education, but platforms have been misused to harm children and promote bullying, vandalism in schools, eating disorders and manipulative marketing, lawmakers said.

The panel wants to learn how algorithms and product designs can magnify harm to children, foster addiction and intrusions of privacy. The aim is to develop legislation to protect young people and give parents tools to protect their children.

TikTok said it stores all of its U.S. user data in the United States. The company also rejected criticism that it promotes harmful content to children.

Earlier this year, after federal regulators ordered TikTok to disclose how its practices affect children and teenagers, the platform tightened its privacy practices for the under-18 crowd.

Snap Inc.'s Snapchat service allows people to send photos, videos and messages that are meant to quickly disappear, an enticement for its young users seeking to avoid snooping parents and teachers. Hence its "Ghostface Chillah" faceless (and wordless) white logo.

Though the service is only 10 years old, Snapchat said an eye-popping 90 percent of 13- to 24-year-olds in the U.S. use it. It reported 306 million daily users in the July-September quarter.

The company agreed in 2014 to settle the FTC's allegations that it deceived users about how effectively the shared material vanished and that it collected users' contacts without telling them or asking permission. The messages, known as "snaps," could be saved using third-party apps or other methods, the regulators said.

Snapchat wasn't fined but agreed to establish a privacy program to be monitored by an outside expert for the next 20 years—similar to oversight imposed on Facebook, Google and Myspace in privacy settlements in recent years.