For years, Apple was seen by its customers as a bastion of privacy, trying to set its premium products apart in a market crowded with cheaper but much more data-hungry devices powered by Google’s Android.
Last week, however, human rights and privacy advocates sounded the alarm about the company’s plans to introduce a major update to its iOS, iPadOS, and macOS operating systems.
Designed to combat the spread of Child Sexual Abuse Material (CSAM), the update will allow Apple to automatically scan images stored on a user’s device and compare them against databases of known abuse imagery maintained by child safety organizations, all before the images are uploaded to the user’s iCloud Photos account. If the images are not flagged as matches, the CSAM detection system learns nothing about them.
If, on the other hand, at least 30 images on a user’s device match photos stored in those databases, a human reviewer will examine the flagged images and refer the case to law enforcement.
(Image source: Apple)
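At a high level, the mechanism is a threshold scheme: each photo is hashed on the device, the hash is compared against a blocklist, and nothing is reported until the number of matches crosses the review threshold. The sketch below illustrates that general idea only; Apple’s actual system relies on NeuralHash, a perceptual hash, combined with cryptographic private set intersection, none of which is reproduced here, and every name in the snippet (known_hashes, REVIEW_THRESHOLD, hash_image) is hypothetical.

```python
# Illustrative sketch only: a naive, threshold-based match counter.
# Apple's real system uses NeuralHash (a perceptual hash) and private set
# intersection; this toy version just shows the thresholding idea.
import hashlib
from pathlib import Path

REVIEW_THRESHOLD = 30  # number of matches before human review, per Apple

def hash_image(path: Path) -> str:
    """Hash the raw bytes of an image file (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(photo_dir: Path, known_hashes: set[str]) -> int:
    """Count how many photos in a directory match a database of known hashes."""
    return sum(
        1 for p in photo_dir.glob("*.jpg") if hash_image(p) in known_hashes
    )

if __name__ == "__main__":
    known_hashes = set()  # would be populated from a hash database
    matches = count_matches(Path("Photos"), known_hashes)
    if matches >= REVIEW_THRESHOLD:
        print(f"{matches} matches: flag account for human review")
    else:
        print(f"{matches} matches: below threshold, nothing is learned")
```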
While initially limited to the US, Apple’s upcoming image scanning program raises concerns that it creates an infrastructure capable of supporting broader surveillance on the company’s devices in the future, to say nothing of contradicting Apple’s own privacy-friendly marketing slogans.
The safety vs. privacy debate
Is the tech giant’s move justified by its stated noble goals, or is it a dangerous precedent that can spark further encroachment on user privacy?
According to privacy expert and Abine/DeleteMe CEO Rob Shavell, Apple’s iCloud Mail already has some form of CSAM detection built in, similar to that used by other major cloud service providers such as Facebook, Microsoft, and Google.
As an example, Shavell notes that in 2020 alone, Facebook’s anti-CSAM system generated more than 20 million reports to the National Center for Missing and Exploited Children (NCMEC), compared to Apple’s 265.
“Many people currently outraged by the implications of Apple’s announced policies have likely been complying with far more invasive behavior by other third parties for years without knowing it,” argues Shavell.
“While there are legitimate concerns about the rapid expansion of pervasive surveillance technology, and in particular, the growing cooperation between commercial data services and government, Apple’s current approach to mitigating potential child exploitation provides a simple method of opt-out for those concerned: Stop using their products and services.”
With that said, Shavell believes the claim that Apple’s upcoming CSAM scanning system does not present a unique threat to privacy is disingenuous. According to him, what’s new about Apple’s approach is that it crosses previously held privacy boundaries and involves actively monitoring content on users’ devices on the part of the company.
“Even if Apple’s current approaches provide assurances of privacy and limitation of access by third parties, the threat is that this new approach represents a significant step forward in the company’s capability for content surveillance,” claims Shavell.
“And history shows that whenever a new capability for data surveillance comes into existence, it will inevitably be exploited either by a governmental authority, or by malicious third parties. Privacy protections that the company may offer to Americans today may not apply to users in countries like, say, China, where Apple has already shown willingness to compromise user protections.”
Shavell thinks that by implementing content scanning at the operating system level, Apple forces consumers to accept new degrees of oversight that did not exist at the time when they first purchased their devices.
“This amounts to a bait-and-switch approach where consumer rights are steadily eroded over time by modifications to service agreements no one reads or understands.”
Rob Shavell
“Telling people to simply ‘stop using these devices or services’ is a glib response after the iPhone has become the single most popular consumer item on Earth. Even if private companies maintain responsible policies that limit their own uses of collected data, third parties are not limited by similar scruples.”
According to Shavell, Grindr didn’t intentionally out an Archbishop, but ‘commercially available records of app signal data’ were leveraged by others who sought to do so. He believes that once new kinds of data collection exist, they will inevitably be used in ways that the data collectors can’t anticipate.
“Today, Apple is scanning content out of a desire to limit child exploitation. But it is a very small step for the same basic approach to be used to monitor other aspects of life.”
Shavell reckons that from a legal perspective, the responsible approach is to draw a hard line between the private lives of individuals and data that service providers are allowed to collect.
According to him, legislation such as the Fourth Amendment Is Not For Sale Act would be a welcome step forward and could prevent government agencies from using commercially collected data for law enforcement. However, he argues, stronger federal personal data privacy laws are more necessary than ever.
In particular, Shavell points to legislation like the GDPR, which requires direct opt-in by consumers for data collection, limits on data sale and sharing, and transparency to end-users about what is being done with their personal information.
Broken from the get-go?
According to Jesse Thé, CEO of Tauria, Apple’s new hash-based image detection system is not only doomed to fail when it comes to catching its intended targets, but it’s also a threat to Western freedoms and democracy.
“Image hashing absolutely fails to catch any even moderately unsophisticated deviants, as even small, inconsequential changes to images like the removal of one line of pixels, or a slight color hue shift, can change the hash,” says Thé.
(Scheme of Apple’s hash-based CSAM detection system. Image source: Apple)
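To make that objection concrete: with an exact cryptographic hash, changing even a single byte of an image produces a completely unrelated digest, which is why naive hash matching is trivially evaded. The toy example below (hypothetical byte buffers standing in for image data) illustrates that fragility; Apple’s NeuralHash is a perceptual hash designed to survive such edits, so this only demonstrates the baseline concern Thé raises, not the behavior of Apple’s actual system.

```python
# Minimal demonstration of the fragility of exact hashing: altering a single
# bit (think: one pixel) yields a completely different digest.
import hashlib

original = bytearray(b"\x89PNG..." + bytes(1024))  # stand-in for image bytes
tweaked = bytearray(original)
tweaked[512] ^= 0x01  # flip one bit in one "pixel"

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())
# The two digests share no structure, even though the underlying
# images would be visually indistinguishable.
```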
Jesse Thé believes that Apple’s announcement of iCloud photo scanning outs the company as a fair-weather supporter of privacy: one that used privacy as a badge of honor only until it became a slightly inconvenient goal.
“Moreover, it forces Apple to employ individuals whose job is to look at and confirm child sexual abuse imagery, an incredibly damaging responsibility that private industry is far from equipped to manage.”
Jesse Thé
“This is not mathematics or physics, and simply not something computer algorithms or artificial intelligence are able to calculate accurately,” Thé told CyberNews. “There are a certain number of people who consider the nude sculptures of ancient Rome pornography, while others consider them art. Is taking a photo of my baby while she sleeps considered creepy or abusive?”
Thé sees iCloud photo scanning as an unwelcome move by Apple, one that shelters behind the current public outcry against child abuse and violence against women, and as a misuse of AI.
It’s also unclear whether a built-in way for law enforcement to access everyone’s iPhones can be kept out of the hands of bad actors looking to repurpose Apple’s CSAM detection system for their own malicious ends.
In any case, even if Apple’s move to detect CSAM was spurred by a noble cause, it remains to be seen whether the company’s good intentions will bring unintended consequences in the near future.
“Whether the scanning becomes a reality or not, Apple’s new trend of utopian thinking is certainly a wrong departure for humanity, not to mention the government’s motives behind such a move,” concludes Jesse Thé.