Apple to Install Software on iPhones That Will Check for Images of Child Sexual Abuse

Apple announced its intention to roll out a new update that would allow the company to detect images of child sexual abuse stored in iCloud Photos. The announcement came paired with two new features similarly designed to protect against child abuse.

Along with the iCloud feature, the company plans to launch a new tool within the Messages app that would warn children and their parents when sexually explicit photos are sent or received. Additionally, Apple announced its intention to expand guidance in Siri and Search to protect children from "unsafe situations."

News of these updates was first reported by the Financial Times, which wrote that the detection feature would "continuously scan photos that are stored on a U.S. user's iPhone," with law enforcement alerted to harmful material. The announcement caught some privacy experts by surprise, given the stance Apple took in 2016 when it refused an FBI request to unlock the San Bernardino terrorists' phone.

Apple said its new feature for detecting child abuse was designed with "user privacy in mind." Here, a sign of the Apple store is seen on December 28, 2020 in Hamburg, Germany. Photo by Jeremy Moeller/Getty Images

Matthew Green, a cryptography professor at Johns Hopkins University, reacted on Twitter saying, "Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems." He followed this by saying, "Imagine what it could do in the hands of an authoritarian government?"

Apple said in a statement released after the Financial Times report that its detection system is designed with "user privacy in mind." Instead of scanning images in the cloud, it said the "system performs on-device matching using a database" of known child abuse images compiled by the National Center for Missing and Exploited Children (NCMEC). Apple wrote that it transforms that database material into unreadable "hashes" that are stored on the user's device.

"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known [child sexual abuse] hashes," the company wrote. "This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image."
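The on-device matching step described above can be loosely pictured as a set-membership check. The sketch below is a simplification and an assumption on this writer's part: it substitutes an ordinary cryptographic hash (SHA-256) for Apple's perceptual hashing, and it omits the private set intersection protocol, which is what keeps the match result hidden.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A cryptographic hash only matches
    byte-identical files; Apple's hashing is designed to also match
    visually similar versions of the same image."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """On-device check: does this image's hash appear in the database of
    known hashes? (The real system wraps this check in private set
    intersection so the result is not revealed directly.)"""
    return image_hash(image_bytes) in known_hashes

# Hypothetical database built from two sample "known" images.
known = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}
print(matches_known_database(b"known-image-1", known))  # True
print(matches_known_database(b"new-photo", known))      # False
```

The hash values, not the images themselves, are what live on the device, which is why Apple describes the stored database material as "unreadable."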

In 2016, Apple did not work with the FBI to unlock the phone of the San Bernardino terrorist. Here, Lt. Paul O'Connell from the Broward County Sheriff's Office uses an AOL account on his computer August 14, 2001 to bring online pedophiles to justice from the Broward County, Florida office. Photo by Joe Raedle/Getty Images

In conjunction with this, Apple said it uses another piece of technology that ensures the safety vouchers cannot be interpreted by the company unless the voucher is flagged as a child sexual abuse image, whereupon the company will "manually review" the reported content. If the content is deemed abusive, the company may disable the individual's account and will send a report to NCMEC, which can then contact law enforcement. The company reported this technology has a "one in one trillion chance per year" of incorrectly flagging an image.
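In its public technical materials Apple described this gating mechanism as threshold secret sharing: voucher contents only become readable once enough matches accumulate. The sketch below illustrates only that gating logic, with hypothetical names and an illustrative threshold; it does not implement the actual cryptography, which enforces the threshold mathematically rather than by policy.

```python
from dataclasses import dataclass

@dataclass
class SafetyVoucher:
    matched: bool   # did the on-device hash match a known image?
    payload: bytes  # encrypted image data, opaque to the server

REVIEW_THRESHOLD = 3  # illustrative value, not Apple's actual threshold

def reviewable_payloads(vouchers: list[SafetyVoucher]) -> list[bytes]:
    """Release payloads for manual review only once the number of
    matching vouchers crosses the threshold. In the real system the
    server cannot decrypt anything below the threshold, rather than
    merely choosing not to look."""
    matches = [v for v in vouchers if v.matched]
    if len(matches) < REVIEW_THRESHOLD:
        return []
    return [v.payload for v in matches]
```

A single match, or even a couple, thus yields nothing reviewable; only an accumulation of matches exposes content to Apple's manual review step.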

In regard to the other two new features, the Siri update intervenes if a user searches for child sexual abuse content and also provides resources for reporting abuse. The update to Messages alerts parents and blurs sexual content sent to a child's phone.