Apple’s Child Protection Features Get Delayed After Privacy Outcry

WASHINGTON — Apple has delayed the child protection features it announced last month, following criticism that the changes could diminish user privacy.

The outcry centered on a feature that would scan users’ photos for child sexual abuse material (CSAM). The changes had been scheduled to roll out later this year.

“Last month, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple’s original announcement detailed three major changes in the works.

One change to Siri and Search would point users to prevention resources if they searched for information related to child sexual abuse material.

The other two changes drew more significant scrutiny. The first would alert parents when their children received or sent sexually explicit photos and would blur those images on the child’s device.

The second would scan images stored in a user’s iCloud Photos for child sexual abuse material and report matches to Apple moderators, who could then refer the reports to the National Center for Missing and Exploited Children (NCMEC).

“This feature is coming in an update later this year to accounts set up as families in iCloud for iOS 15, iPadOS 15, and macOS Monterey,” Apple said.

The company made the case at length that the iCloud Photos scanning system did not weaken user privacy.

The system checks photos stored in iCloud Photos on a user’s iOS device against a database of known child sexual abuse material image hashes supplied by NCMEC and other child safety organizations.
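Apple did not publish the matcher’s source code, but the description above amounts to a membership test against a hash database. The sketch below is a deliberately simplified Python model: it uses SHA-256, an exact cryptographic hash, whereas Apple’s actual system derives NeuralHash, a perceptual fingerprint that tolerates resizing and re-encoding, and performs the comparison through cryptographic private set intersection. All directory names and file paths here are hypothetical.

```python
import hashlib
from pathlib import Path


def image_fingerprint(path: Path) -> str:
    """Hash an image file's bytes with SHA-256.

    Simplification: Apple's system uses NeuralHash, a perceptual hash
    that survives resizing and re-encoding; a cryptographic hash like
    SHA-256 only matches byte-identical files.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(photo_dir: Path, known_hashes: set[str]) -> list[Path]:
    """Return the photos whose fingerprint appears in the known-hash set."""
    return [
        photo
        for photo in sorted(photo_dir.glob("*.jpg"))
        if image_fingerprint(photo) in known_hashes
    ]


if __name__ == "__main__":
    # Hypothetical database of known-image hashes. In Apple's design this
    # set comes from NCMEC and partner organizations, is shipped to the
    # device in blinded form, and matching happens on-device without
    # revealing non-matches to either party.
    known_hashes = {
        image_fingerprint(p) for p in Path("known_material").glob("*.jpg")
    }
    flagged = scan_library(Path("icloud_photos"), known_hashes)
    print(f"{len(flagged)} photo(s) matched the hash database")
```

In Apple’s published design, individual match results also remain cryptographically hidden until an account crosses a threshold number of matches, which is central to the company’s argument that the system preserved privacy.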

“Siri and Search are also being updated to intervene when users perform searches for queries related to child sexual abuse material,” Apple said.

“These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”

Still, some privacy and security experts heavily criticized the new system, arguing that it would create an on-device surveillance mechanism and violate the trust users had placed in Apple to protect on-device privacy.

The U.S.-based digital rights group the Electronic Frontier Foundation said that the new system, however well-intended, would “break key promises of the messenger’s encryption itself and open the door to broader abuses.”

(With inputs from ANI)

Edited by Saptak Datta and Praveen Pramod Tewari
