Apple Inc. has once again ruffled a few feathers with its latest policy. The company will automatically scan iPhones and iPads to hunt for Child Sexual Abuse Material (CSAM).
After upsetting nearly every tech giant, including Google and Facebook, with its App Tracking Transparency (ATT) framework, Apple Inc. may now have upset several privacy advocates as well.
The hunt for CSAM will begin, but only in the U.S.:
Apple Inc. has confirmed it is working on a new feature that will go live in iOS 15 and iPadOS 15. The company will automatically go through the photos on select iPhone and iPad devices.
The company has confirmed that it will scan images only to hunt for material that definitively links to child abuse or exploitation. Technically speaking, Apple will automatically scan images on a user’s device to see if they match previously identified CSAM content.
Apple explains how iPhones will scan photos for child-sexual-abuse images #Apple #technology #cellphone https://t.co/P91fbnp4ZI
— Alt-Ctrl-Dark-Matter (@nophreak) August 8, 2021
Simply put, Apple is going to great lengths to check whether images match a specific set of digital identifiers, or hashes. This method is quite common among cybersecurity companies dealing with sex trafficking.
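Conceptually, the matching step can be pictured as comparing an image’s fingerprint against a database of fingerprints of already-catalogued material. The sketch below is a hypothetical simplification in Swift: it uses an ordinary SHA-256 digest as a stand-in for Apple’s perceptual NeuralHash, and the hash set and function names are illustrative, not Apple’s actual API.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified illustration of hash-based matching. Apple's real
// system uses a perceptual "NeuralHash" plus cryptographic blinding; a plain
// SHA-256 digest stands in here purely to show the matching concept.
let knownHashes: Set<String> = [
    "placeholder-hash-1",   // in practice: identifiers supplied by child-safety organizations
    "placeholder-hash-2"
]

// Compute a hex digest of the raw image bytes.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// An image is flagged only if its identifier appears in the known-material set.
func matchesKnownMaterial(_ imageData: Data) -> Bool {
    knownHashes.contains(digest(of: imageData))
}
```

The point the sketch captures is that the device does not judge new content on its own; an image is only flagged if its identifier matches previously identified CSAM.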
Incidentally, there will be thousands of Apple iPhone and iPad devices that the company will not scan at all.
Hey @martinfowler @KentBeck @unclebobmartin @timberners_lee what do u think about it? https://t.co/e16jS1KQs4
— Paul (@RegExTrex) August 8, 2021
Apple Inc. will initiate automated scans only on those devices that have a child in the network. Simply put, if an iOS or iPadOS device isn’t registered to a family network as belonging to a child, Apple won’t scan it.
Apple device users who exchange content that is questionable in nature would obviously be subject to the automated scans. Cybersecurity experts have long warned smartphone and internet users not to share sexual content.
Call me cynical, but when I read this, my immediate thought is: Apple wants to normalize the idea of a company scanning the contents of your phone. So, Apple begins the process by citing a justification it expects people will be disinclined to object to. https://t.co/ihnkFqnlfC
— Patterico (@Patterico) August 8, 2021
As an extension of the aforementioned policy, Apple will ignore devices that show no indication of involvement with CSAM.
Lastly, the new policy will come into effect only in the United States of America, at least for now. It is not clear if or when Apple will start automatically scanning devices outside America.
Apple Inc.’s new initiative goes a lot further to protect children from predators who use digital communication tools:
Apple’s decision to automatically scan iPhone and iPad devices has several security experts and privacy advocates up in arms. The company may have noble intentions, but its actions would obviously be a violation of privacy.
Child sexual abuse material and the abusers who traffic in it are repugnant, and everyone wants to see those abusers caught.
— Will Cathcart (@wcathcart) August 6, 2021
The CSAM-scanning feature doesn’t appear to be optional. Moreover, Apple will almost certainly include it in iOS 15 and iPadOS 15 by default.
This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.
— Will Cathcart (@wcathcart) August 6, 2021
Simply put, as and when an iPhone or iPad updates to the new OS, which happens automatically and free of charge, the CSAM-scanning feature will be embedded. And as with other system-level features on Apple devices, removing it is next to impossible.
Incidentally, Apple Inc. is going way beyond merely scanning photos to hunt for CSAM. Apple has reportedly added two systems that parents can optionally enable for children in their family network.
Privacy Whistleblower Edward Snowden and EFF Slam Apple's Plans to Scan Messages and iCloud Images https://t.co/khdvkEVwVX
— The New Oil (@TheNewOil1) August 8, 2021
An on-device analysis in the Messages app can scan incoming and outgoing photos for material that might be sexually explicit. The iMessage app will automatically blur questionable content by default. An optional setting can also inform account-linked parents if their children have viewed the content.
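To make that flow concrete, here is a hedged sketch in Swift with entirely hypothetical names, since Apple has not published this API: an on-device classifier scores an incoming photo, anything above a threshold is blurred by default, and the optional parental notification fires only for child accounts that have it enabled, and only once the child views the content.

```swift
import Foundation

struct IncomingPhoto {
    let data: Data
}

// Placeholder for the on-device ML model; assumed to return a score in 0.0...1.0.
func explicitContentScore(for photo: IncomingPhoto) -> Double {
    return 0.0
}

// Hypothetical handling of an incoming photo in Messages.
func handleIncoming(_ photo: IncomingPhoto, isChildAccount: Bool, parentAlertsEnabled: Bool) {
    let threshold = 0.9                               // illustrative cut-off
    guard explicitContentScore(for: photo) > threshold else {
        display(photo)                                // below threshold: show normally
        return
    }
    displayBlurred(photo)                             // questionable content is blurred by default
    if isChildAccount && parentAlertsEnabled {
        // Optional setting: linked parents are informed only if the child views the content.
        onChildViews(photo) { notifyLinkedParents() }
    }
}

func display(_ photo: IncomingPhoto) { /* render normally */ }
func displayBlurred(_ photo: IncomingPhoto) { /* render behind a blur overlay */ }
func onChildViews(_ photo: IncomingPhoto, then action: @escaping () -> Void) { /* call back if viewed */ }
func notifyLinkedParents() { /* alert the linked parent accounts */ }
```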
Besides these features, Apple is pushing for better access to CSAM reporting mechanisms. Apple’s Siri and Search will proactively bring up helpful resources if a user asks about reporting CSAM. The platforms will also caution anyone who attempts to search for CSAM.
Mocking "Think of the children " is a bad one. But pointing out 1) the very serious risk CSAM scanning gets a abused to scan for other things and 2) questioning how much child abuse it will stop … Is exactly the balancing act Apple should have done and didn't. https://t.co/u1DiyccCX2
— Ian Miers (@secparam) August 8, 2021
Apple proactively prioritized protection from tracking by deploying a system-level security prompt. ATT has already proven successful, with more than 90 percent of Americans denying consent to being tracked.
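For context, that prompt is triggered by a single system API call from the AppTrackingTransparency framework; the minimal illustration below uses a hypothetical wrapper function name around the real request.

```swift
import AppTrackingTransparency

// Ask the system to show the ATT prompt; the user can simply decline.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            print("User allowed tracking")      // advertising identifier is available
        case .denied, .restricted, .notDetermined:
            print("Tracking not permitted")     // app must fall back to non-tracked behavior
        @unknown default:
            print("Tracking not permitted")
        }
    }
}
```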
Similarly, Apple could be proactively taking a stand against the omnipresent threat to young and vulnerable children who are increasingly turning to the Internet. While Apple Inc.’s intentions are certainly noble, automated scans of user data have already upset many, and will continue to do so.