Apple delays but doesn’t cancel plan to scan iPhones and iPads for CSAM content: Will ‘collect input and make improvements’ to controversial technology

A delay, but not a cancellation. Pic credit: plasticpeople/Flickr

Apple Inc. has confirmed that it will delay the deployment of its controversial CSAM (Child Sexual Abuse Material) scanning policy. The company indicated it will “take additional time over the coming months to collect input and make improvements.”

After largely negative feedback and significant backlash, Apple Inc. has hit pause on scanning the photo libraries of iPhones and iPads to hunt for images of child exploitation.

Apple Inc. was to scan individual devices and iCloud Photos for CSAM:

Last month, Apple Inc. announced a rather controversial policy. Although well-intentioned, the policy involved scanning personal iPhone and iPad devices.

Apple Inc. said it would start scanning iPhone and iPad devices, hunting for material that matches entries in the global database of child sexual abuse and exploitation imagery.

Besides locally scanning its smartphones and tablets, Apple Inc. would also scan its remote cloud-hosted service, iCloud Photos. The feature was to go live with iOS 15, iPadOS 15, and macOS Monterey later this year.

Apple Inc. has repeatedly clarified that it would not be looking at the actual photos. Instead, the company would compute digital “fingerprints” of images and match them against the CSAM database. Technically, the system would check whether an image’s hash matches the CSAM image hashes provided by NCMEC (National Center for Missing & Exploited Children).

Simply put, Apple would compare these image-derived hashes, rather than the photos themselves, to confirm whether the imagery matched the global CSAM database. Many technology companies and security agencies regularly deploy similar hash-matching technology and rely on the well-established CSAM database.
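
To illustrate the general idea of hash-list matching, here is a minimal sketch. It is not Apple’s implementation: Apple’s announced system uses a perceptual hash (NeuralHash) computed on-device and matched through a privacy-preserving protocol, whereas this sketch uses a plain cryptographic hash, and all names, paths, and hash values in it are hypothetical.

```python
# Illustrative sketch only. Real CSAM-detection systems (including Apple's
# announced design) rely on perceptual hashes such as NeuralHash, which
# tolerate resizing and re-encoding; a cryptographic hash like SHA-256 only
# matches byte-identical files. All names, paths, and hash values below are
# hypothetical.
import hashlib
from pathlib import Path

# Hypothetical stand-in for the NCMEC-provided list of known image hashes.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def fingerprint(image_path: Path) -> str:
    """Compute a digital 'fingerprint' (here, SHA-256) of an image file."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def flag_for_review(library_dir: Path) -> list[Path]:
    """Return images whose fingerprints appear in the known-hash list."""
    return [
        image_path
        for image_path in library_dir.glob("*.jpg")
        if fingerprint(image_path) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    # Hypothetical usage: check a local photo library against the hash list.
    matches = flag_for_review(Path("~/Pictures").expanduser())
    print(f"{len(matches)} image(s) matched the known-hash list")
```

In Apple’s announced design, such matches would not be acted on automatically; flagged material would go to the human review step described next.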

If Apple Inc. discovered evidence strongly suggesting a match with CSAM, it would involve a human reviewer. This person would visually confirm the imagery and pass the information on to law enforcement if necessary.

Although clearly well-intentioned, the controversial policy drew unified opposition from several privacy advocacy groups. Apple Inc. has now indicated that it will gather more input and conduct more research before deploying the policy.

Apple Inc. delays but does not cancel its plan to scan individual devices:

Needless to say, there was a huge backlash against the controversial policy. The Electronic Frontier Foundation reportedly amassed more than 25,000 signatures from consumers, and more than 100 groups called on Apple Inc. to abandon the technology.

Presumably taking the criticism into consideration, Apple Inc. issued a statement, which reads:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Incidentally, alongside CSAM scanning, Apple Inc. will also reportedly delay the communication safety features in Messages and the expanded guidance for Siri and Search.
