Apple Drops Project to Scan iCloud Photos for Child Sexual Abuse Material

The arrival of iOS 15 brought a broad set of features intended to protect children from today’s big problems. Three groups of functionality were announced. The two that are available today are Communication Safety within iMessage and, on the other hand, warnings around these issues in Siri, Spotlight and Safari. However, Apple’s flagship feature was the scanning of users’ iCloud photos to find child sexual abuse material. After several months of delays, the project has now been abandoned by Apple.

Apple’s project: a photo scanner in search of CSAM

Apple’s flagship tool was known for the ethical and privacy dilemma it raised when it was introduced more than a year ago. Its purpose was to scan users’ iCloud photos for child sexual abuse imagery. Apple did not use the term “child pornography” officially; the term used was CSAM (Child Sexual Abuse Material).


Some child protection features that are currently available on iOS

To do this, the Cupertino company allied itself with NCMEC, the National Center for Missing & Exploited Children in the United States. This center maintains a large database of known CSAM images. Each of these photos has a signature, or hash, that does not vary: if an image the user owns has the same signature as one in the database, the alarms go off.

Before an image is stored in iCloud Photos, it is checked on the device against the unreadable set of known CSAM signatures. This matching process is based on a cryptographic technology called private set intersection (PSI), which determines whether there is a match without revealing the result. PSI allows Apple to learn whether an image hash matches a known CSAM hash without learning anything about non-matching hashes, and it also prevents the user from knowing whether there is a match.
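The full protocol relies on elaborate cryptography, but the underlying idea of comparing signatures against a known set can be sketched in a few lines. The following simplified Python sketch is illustrative only: real PSI keeps both sides blinded and encrypted, and `image_hash` here is a hypothetical stand-in for Apple’s perceptual NeuralHash, not the actual algorithm.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Hypothetical stand-in for a perceptual image hash.

    Apple's real system uses NeuralHash, which gives visually identical
    images the same signature; a plain SHA-256 over the raw bytes is
    used here only to keep the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Database of known CSAM signatures (dummy placeholder values here).
known_csam_hashes = {
    image_hash(b"known-bad-image-1"),
    image_hash(b"known-bad-image-2"),
}

def check_before_upload(image_bytes: bytes) -> bool:
    """Conceptual match check run before an image reaches iCloud Photos.

    In the real design the device never sees the plaintext hash list and
    never learns the outcome of this comparison: private set intersection
    keeps the result hidden from the user. This sketch skips the
    cryptography and shows only the set-membership idea.
    """
    return image_hash(image_bytes) in known_csam_hashes
```

In the deployed design, the result of each comparison would be wrapped in an encrypted "safety voucher" rather than returned to either party directly.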


Apple assured that the probability of a false alarm was minimal: an account would need more than 30 photos whose signatures, or hashes, matched the CSAM database before Apple intervened. Even so, the technology, ethics and security communities, as well as Apple’s own employees, raised an avalanche of criticism, and the feature was postponed and never shipped in iOS 15.

The project to scan photos for CSAM is halted

A few hours ago, Apple published a statement in WIRED announcing that it has abandoned development of this iCloud CSAM photo scanner:

Following extensive expert consultation to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. In addition, we have decided not to go ahead with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies tracking their personal data, and we will continue to work with governments, children’s advocates and other companies to help protect young people, preserve their right to privacy and make the Internet a safer place for children and for all of us.

Apple thus abandons a project that began a little over a year ago, driven by the barrage of security problems and criticism the measure has faced since its presentation. Now the Cupertino company is trying to address the problem at its source, aiming to prevent child abuse material from being produced in the first place by investing in other measures, such as iMessage’s Communication Safety feature.