Apple Backtracks on Child Safety Update Following Criticism Over Privacy Concerns

Apple said Friday it plans to delay its "Expanded Protections for Children" iOS update, announced last month, after the company faced negative feedback over privacy concerns.

The operating system update for Apple devices includes a tool for parents that would warn them if their child was receiving or sending sexually explicit photos. It also includes expanded guidance in the Siri and Search features intended to further protect children from "unsafe situations."

But it was the third component of the update that particularly caught the eye of privacy advocates.

A feature designed to catch images of child sexual abuse would scan users' stored iCloud photos and check them against material in a database operated by the National Center for Missing and Exploited Children (NCMEC). If images were flagged as potentially illegal, they would be reviewed by the company and the NCMEC, which could ultimately result in police involvement.

"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," Apple wrote in an addendum to its previous announcement.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said.

Apple said Friday it plans to delay its "Expanded Protections for Children" update for its devices' operating system. Above, the company's logo on an Apple Store in San Francisco. Photo by Justin Sullivan/Getty Images

The move came as a surprise to some, given the company's history of going to great lengths to demonstrate its commitment to protecting users' privacy and security. In 2016, Apple refused an FBI request to unlock the iPhone of a mass shooter in San Bernardino, California, citing privacy concerns.

Originally, Apple said the new technology was nothing for users to be concerned about. The company explained that images stored by law-abiding users would likely never be seen by Apple employees, because the update would vet the material by checking it against the NCMEC database, which would be stored on a user's device in the form of unreadable "hashes."

Under that system, the image matching would be handled by software on the device, which would seemingly keep the company at arm's length from users' private photos. In a case where an image was flagged and brought to the company's attention, Apple said it was unlikely the material would have been flagged in error, because the detection system reportedly had a "one in one trillion chance per year" of incorrectly flagging an image.
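To make the concept concrete: Apple's published description refers to a perceptual "NeuralHash" and a cryptographic private set intersection protocol, not the plain hash comparison shown here. The minimal Swift sketch below only illustrates the general idea of matching local content against a fixed list of known hashes; the database entries, function name, and use of SHA-256 are invented for illustration and are not Apple's actual system.

```swift
import Foundation
import CryptoKit

// Hypothetical set of known-image hashes, standing in for a database like NCMEC's.
// Real systems use perceptual hashes that tolerate resizing and recompression;
// a cryptographic hash like SHA-256 is used here only to keep the sketch simple.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Hash a photo's raw bytes and check the result against the local database.
// Only an exact match is reported; any other image stays unflagged.
func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: an arbitrary photo produces a non-matching hash and is never surfaced.
let userPhoto = Data("example image bytes".utf8)
print(matchesKnownImage(userPhoto))   // prints "false" for this made-up input
```

The point of the design, as Apple described it, is that the comparison happens on the device against opaque hash values, so neither the company nor anyone else sees the photos of users whose images never match the database.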

Nonetheless, members of the privacy advocacy community remained firm in their critical reaction, viewing the move as a potential precedent for future privacy invasions. Privacy campaigner and whistleblower Edward Snowden was one of those who voiced his concerns.

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," he wrote in a tweet. "Make no mistake: If they can scan for kiddie porn today, they can scan for anything tomorrow."

Newsweek reached out to Apple for further comment but didn't hear back before publication.