Apple Inc. will scan photos on select iPhones and iPads to hunt for ‘child abuse’ imagery: Is it a privacy nightmare or excessive surveillance?

Apple Inc. will initiate automated scans on select iOS and iPadOS devices to search for CSAM. Pic credit: Pabak Sarkar/Flickr

Apple Inc. has once again ruffled a few feathers with its latest policy. The company will automatically scan iPhones and iPads to hunt for Child Sexual Abuse Material (CSAM).

After upsetting nearly every tech giant, including Google and Facebook, with its App Tracking Transparency (ATT) policy, Apple Inc. may now have upset several privacy advocates.

The hunt for CSAM will begin, but only in the U.S.:

Apple Inc. has confirmed it is working on a new feature that will go live in iOS 15 and iPadOS 15. The company will automatically go through the photos on select iPhone and iPad devices.

The company has confirmed that it will scan images only to hunt for content that definitively links to child abuse or exploitation. Technically speaking, Apple will automatically scan images on a user’s device to see if they match previously identified CSAM content.

Simply put, Apple relies on the fact that known CSAM images carry a specific set of digital identifiers, or fingerprints, and it checks on-device photos against that list. This method is quite common among cybersecurity companies and organizations that fight child exploitation and sex trafficking.
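For illustration only, here is a minimal Swift sketch of the general idea of fingerprint matching: hash each photo and look the result up in a list of known-bad fingerprints. Apple’s actual system reportedly relies on a perceptual hash and a cryptographic matching protocol rather than the plain SHA-256 stand-in used here, and all of the names below are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of hash-based matching. SHA-256 is only a stand-in
// to illustrate the matching principle; it is not what Apple's system uses.
struct ImageFingerprintMatcher {
    // Hypothetical database of known-bad image fingerprints (hex digests).
    let knownHashes: Set<String>

    // Fingerprint an image file and check it against the known-hash list.
    func matches(imageAt url: URL) throws -> Bool {
        let data = try Data(contentsOf: url)
        let digest = SHA256.hash(data: data)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Example usage with a placeholder hash list and photo path.
let matcher = ImageFingerprintMatcher(knownHashes: ["<hex digest of a known image>"])
// let flagged = try matcher.matches(imageAt: URL(fileURLWithPath: "/path/to/photo.jpg"))
```

Because a cryptographic hash changes completely if even one pixel changes, real systems use perceptual hashes that survive resizing and recompression; the lookup logic, however, is essentially the same.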

Incidentally, there will be many Apple iPhone and iPad devices that the company will not scan at all.

Apple Inc. will initiate automated scans only on those devices that have a child in the network. Simply put, if an iOS or iPadOS device isn’t linked to a family network as belonging to a child, Apple won’t scan it.

Apple device users who exchange questionable content would obviously be part of the automated scans. Cybersecurity experts have long warned smartphone and internet users not to share sexual content.

As an extension of the abovementioned policy, Apple will ignore devices that show no indicative signs of involvement with CSAM.

Lastly, the latest policy will come into effect only in the United States of America, for now. It is not clear if or when Apple will automatically start scanning devices outside America.

Apple Inc.’s new initiative goes a lot farther to protect children from predators who use digital communication tools:

Apple’s decision to automatically scan iPhone and iPad devices has several security experts and privacy advocates up in arms. The company may have noble intentions, but to its critics the move amounts to a violation of privacy.

The CSAM-scanning feature doesn’t appear to be optional. Moreover, Apple will almost certainly ship it enabled by default in iOS 15 and iPadOS 15.

Simply put, when an iPhone or iPad receives the free, automatic update, the CSAM-scanning feature will be embedded. And as with most system-level features on Apple devices, removing it is next to impossible.

Incidentally, Apple Inc. is going way beyond merely scanning photos to hunt for CSAM. Apple has reportedly added two systems that parents can optionally enable for children in their family network.

An on-device analysis in the Messages app can scan incoming and outgoing photos for material that might be sexually explicit. The Messages app will automatically blur questionable content by default. An optional setting can also inform account-linked parents if their children view such content.
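As a rough sketch of that flow, and nothing more, the logic described above could look something like the following. None of these types are real Apple APIs; the classifier, the settings flag, and the callbacks are all hypothetical.

```swift
import Foundation

// Hypothetical on-device verdict from an image classifier (assumed, not a real API).
enum PhotoVerdict { case safe, possiblyExplicit }

protocol ExplicitImageClassifier {
    func classify(_ imageData: Data) -> PhotoVerdict
}

// Sketch of the Messages-style flow described above.
struct MessagePhotoFilter {
    let classifier: ExplicitImageClassifier
    let notifyParents: Bool   // optional setting controlled by the family organizer (assumed)

    // Step 1: decide whether an incoming or outgoing photo gets blurred behind a warning.
    func shouldBlur(_ imageData: Data) -> Bool {
        if case .possiblyExplicit = classifier.classify(imageData) { return true }
        return false
    }

    // Step 2: if the child taps through and views a flagged photo,
    // optionally alert the linked parent account.
    func childViewedFlaggedPhoto(alertParent: () -> Void) {
        if notifyParents { alertParent() }
    }
}
```

The key design point in Apple’s description is that both steps happen on the device itself; nothing about the photo leaves the phone for this feature.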

Besides these features, Apple is pushing for better access to CSAM reporting mechanisms. Siri and Search will proactively surface helpful resources if a user asks about reporting CSAM. Both will also caution anyone who attempts to search for CSAM-related content.

Apple proactively prioritized protection from tracking by deploying a system-level security prompt. ATT has already proven successful: more than 90 percent of American users have reportedly denied consent to being tracked.
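That prompt is the App Tracking Transparency request that apps must make on iOS 14.5 and later before tracking users across other apps and websites. A minimal example of triggering it looks like this:

```swift
import AppTrackingTransparency

// Ask the system to show the ATT dialog; the user's choice is enforced OS-wide.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            print("User allowed tracking")       // advertising identifier is available
        case .denied, .restricted, .notDetermined:
            print("Tracking not permitted")      // the roughly 90 percent case cited above
        @unknown default:
            print("Tracking not permitted")
        }
    }
}
```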

Similarly, Apple could be taking a proactive stand against the omnipresent threats facing young and vulnerable children, who are increasingly turning to the Internet. While Apple Inc.’s intentions are certainly noble, automated scans of user data will, and already have, upset many.
