iPhones will scan your photos for child abuse

Apple will roll out new child safety features in iOS, iPadOS, and macOS to clamp down on the spread of child sexual abuse material (CSAM) across its ecosystem.

Firstly, it is introducing new technology that can scan images stored in iCloud Photos for content that depicts sexually explicit activities involving a child.

The company said this would enable it to report these instances to the National Center for Missing and Exploited Children (NCMEC).

NCMEC is a comprehensive reporting centre for CSAM and collaborates with law enforcement agencies across the United States.

Apple said its method is designed with user privacy in mind.

First, the system will perform on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organisations.

Apple said it further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
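In simplified terms, the on-device step boils down to checking an image fingerprint against a set of known hashes. The Python sketch below uses SHA-256 of the file bytes as a stand-in fingerprint, which only matches byte-identical copies; Apple's system uses a perceptual hash so that resized or re-encoded versions of a known image still match, and the database on the device is blinded rather than stored in the clear. All names and values here are illustrative.

```python
import hashlib

# Stand-in fingerprint: SHA-256 of the raw file bytes. This only matches
# byte-identical copies; a perceptual hash is needed to catch resized or
# re-encoded versions of a known image.
def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder for the database of known-image hashes shipped to the device.
known_hashes = {fingerprint(b"example-known-image")}

def is_known(image_bytes: bytes) -> bool:
    return fingerprint(image_bytes) in known_hashes

print(is_known(b"example-known-image"))  # True
print(is_known(b"holiday-photo"))        # False
```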

“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result,” Apple explained.

“The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
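Private set intersection is a general cryptographic technique, and a toy Diffie-Hellman-style version of it conveys the idea: both sides blind their hashed items with secret exponents, and only one party ends up able to see where the sets overlap, while the other never learns the result. The sketch below is not Apple's protocol, which additionally binds an encrypted safety voucher to every image and uses production-grade parameters; the modulus, exponents, and placeholder fingerprints are purely illustrative.

```python
import hashlib
import secrets

# Toy modulus: the Mersenne prime 2^127 - 1. A real deployment would use a
# proper elliptic-curve group; these parameters are for illustration only.
P = 2**127 - 1

def hash_to_group(item: bytes) -> int:
    """Map an image fingerprint to a group element."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

# --- Server side: database of known fingerprints (placeholder values) ---
server_db = [b"known-fingerprint-A", b"known-fingerprint-B"]
b = secrets.randbelow(P - 3) + 2                    # server's secret exponent
server_blinded = [pow(hash_to_group(s), b, P) for s in server_db]

# --- Device side: the user's photo fingerprints (placeholder values) ---
device_photos = [b"known-fingerprint-B", b"holiday-photo"]
a = secrets.randbelow(P - 3) + 2                    # device's secret exponent
device_blinded = [pow(hash_to_group(x), a, P) for x in device_photos]

# The device re-blinds the server's already-blinded list and hands it back in
# a canonical order; it never compares anything, so it never sees the result.
double_blinded_db = {pow(e, a, P) for e in server_blinded}

# --- Server finishes: apply its exponent to the device's list and intersect ---
matches = [pow(e, b, P) in double_blinded_db for e in device_blinded]
print(matches)  # [True, False] -> only the server learns that one photo matched
```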

Another technology called threshold secret sharing will ensure that the contents of the safety vouchers cannot be interpreted unless the iCloud Photos account crosses a threshold of known CSAM content.

“The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” Apple said.

If the threshold is exceeded, Apple can interpret the contents of the safety vouchers associated with the matching CSAM images.
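Threshold secret sharing itself is a standard construction, usually credited to Shamir. The sketch below splits a key into shares so that any number of shares at or above the threshold reconstructs it, while fewer reveal nothing useful; in Apple's description, each matched image effectively contributes a share towards unlocking the vouchers. The threshold, field size, and use of the key here are toy choices, not Apple's parameters.

```python
import secrets

# Shamir secret sharing over a prime field. Think of `secret` as the key that
# decrypts the safety vouchers: the key is only recoverable once `threshold`
# shares (one per matched image) are available.
PRIME = 2**127 - 1  # toy field size

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them suffice."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, count + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * -xm % PRIME
                den = den * (xj - xm) % PRIME
        total = (total + yj * num * pow(den, -1, PRIME)) % PRIME
    return total

key = secrets.randbelow(PRIME)              # stand-in voucher decryption key
shares = make_shares(key, threshold=3, count=5)
print(reconstruct(shares[:3]) == key)       # True  -> at or above the threshold
print(reconstruct(shares[:2]) == key)       # False -> below it, key stays hidden
```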

Human reviewers at Apple will then manually confirm each match before the company disables the user’s account and sends a report to NCMEC.

“If a user feels their account has been mistakenly flagged, they can file an appeal to have their account reinstated,” Apple said.

Apple is also adding new tools to the Messages app that warn children and their parents when sexually explicit photos are received or sent.

On-device machine learning software will analyse image attachments and determine if a photo is sexually explicit.

“When receiving this type of content, the photo will be blurred, and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo,” Apple explained.

“As an additional precaution, the child can also be told that to make sure they are safe, their parents will get a message if they do view it.”

If a child attempts to send sexually explicit photos, they will also be warned before the photo is sent, and parents can receive a message if the child chooses to send it anyway.
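In outline, the Messages feature is a local scoring step followed by a user-interface decision. The sketch below shows that shape only: the classifier, the 0.9 threshold, and the function names are hypothetical stand-ins, since Apple has not published its model or parameters; the one detail taken from the announcement is that the analysis happens on the device.

```python
from dataclasses import dataclass

@dataclass
class ScanDecision:
    blur: bool    # blur the photo and show the warning?
    score: float  # model's estimate that the image is sexually explicit

def scan_attachment(image, model, threshold: float = 0.9) -> ScanDecision:
    """`model` is any callable that scores an image locally; the 0.9
    threshold is a made-up value, not a published Apple parameter."""
    score = float(model(image))
    return ScanDecision(blur=score >= threshold, score=score)

# Demo with a dummy "model" that returns a fixed score.
decision = scan_attachment(image=b"attachment-bytes", model=lambda img: 0.97)
if decision.blur:
    print("Blur the photo, warn the child, and offer resources.")
else:
    print("Display the photo normally.")
```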

The Messages feature will arrive later in 2021 in an update to iOS 15, iPadOS 15, and macOS Monterey, and applies to accounts set up as families in iCloud.

Lastly, Apple will update Siri and Search later this year to provide additional resources to help children and parents stay safe online and get help with unsafe situations.

“For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.”

Siri and Search will also attempt to intervene when users perform searches for queries related to CSAM.

“These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue,” Apple stated.


Now read: Big changes for Facebook privacy and security settings
