Apple’s stealthy photo scanning feature raises privacy concerns


Users are only now discovering that their Apple devices come with the “Enhanced Visual Search” feature enabled by default, sending photo data to the Cupertino giant’s servers for analysis. Apple’s explanations suggest that the data remains private throughout.

The new Apple feature, introduced with iOS 18 and macOS Sequoia, is meant to help users search their photos for landmarks, such as the Eiffel Tower. However, it works by sending photo data to be compared against a global index hosted on Apple-maintained servers.

Previously, iPhones and Macs relied solely on on-device analysis to recognize the faces of people and pets, as well as other objects.


As highlighted by developer Jeff Johnson, the privacy-eroding change was introduced silently and enabled by default.

“I never requested that my on-device experiences be ‘enriched’ by phoning home to Cupertino. This choice was made by Apple, silently, without my consent,” Johnson said in a blog post.

“From my own perspective, computing privacy is simple: if something happens entirely on my computer, then it's private, whereas if my computer sends data to the manufacturer of the computer, then it's not private, or at least not entirely private.”

How does Enhanced Visual Search (EVS) work?

Apple released iOS 18 and macOS 15 on September 16th, 2024, and the company’s own overview of the new features did not mention EVS. Apple said only that search now supports “natural language queries and expanded understanding” for photos and videos.

Apple briefly explains the feature on its support pages, where it also touches on privacy. EVS allows users to search their photos for landmarks or points of interest.

“Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy and use an OHTTP relay that hides IP addresses. This prevents Apple from learning about the information in your photos,” Apple explains.
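
To give a sense of what “homomorphic encryption” means in practice, here is a toy Python example using the textbook Paillier scheme with deliberately tiny, insecure keys. This is not the scheme, key size, or code Apple uses; it only illustrates the property Apple’s claim relies on: a server can compute on encrypted data and return an encrypted result without ever seeing the plaintext.

    # Toy (insecure, tiny-key) Paillier cryptosystem, used here only to
    # illustrate the homomorphic property: the "server" adds two numbers
    # it can never read. This is NOT the scheme or key size Apple uses.
    from math import gcd
    import random

    def lcm(a, b):
        return a * b // gcd(a, b)

    # --- key generation (toy primes; real deployments use 2048+ bit keys) ---
    p, q = 293, 433                    # small demo primes
    n = p * q
    n_sq = n * n
    g = n + 1                          # standard simplification
    lam = lcm(p - 1, q - 1)
    mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)   # inverse of L(g^lam) mod n

    def encrypt(m):
        """Encrypt integer m < n under the public key (n, g)."""
        r = random.randrange(1, n)
        while gcd(r, n) != 1:
            r = random.randrange(1, n)
        return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

    def decrypt(c):
        """Decrypt with the private key (lam, mu)."""
        return ((pow(c, lam, n_sq) - 1) // n) * mu % n

    # --- the homomorphic property ---
    c1, c2 = encrypt(17), encrypt(25)
    c_sum = (c1 * c2) % n_sq           # server multiplies ciphertexts...
    assert decrypt(c_sum) == 42        # ...yet the client decrypts 17 + 25

In EVS, the computation performed on ciphertexts is a similarity comparison against the landmark index rather than a simple addition, but the privacy argument is the same: the server only ever handles encrypted values.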

On October 24th, more than a month after the initial update, Apple released a highly technical document explaining EVS in more detail.


“At Apple, we believe privacy is a fundamental human right,” the first sentence reads.

According to the document, when an Apple device detects a point of interest in a photo, the on-device AI creates a mathematical fingerprint: “A vector embedding is calculated for that region of the image.”

The encrypted ‘fingerprint’ is then sent to Apple’s servers through a relay in the middle for ‘privacy protection,’ alongside decoy queries so the server cannot tell which one is genuine. Apple’s servers compare the queries against the index and return possible matches in encrypted form.

[Diagram: how Enhanced Visual Search works. Image by Apple.]
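
Pieced together, the flow the document describes looks roughly like the following Python sketch. Every function name below is hypothetical and invented for illustration, and the ‘encryption’ is a placeholder rather than real homomorphic encryption; this shows only the shape of the pipeline, not Apple’s implementation.

    # Conceptual sketch of the client-side flow described above. All names
    # are hypothetical; none of this is Apple's code or API.
    import random

    def embed_region(region: bytes) -> list[float]:
        # Stand-in for the on-device model that turns a region of interest
        # into a vector embedding (the "mathematical fingerprint").
        return [b / 255 for b in region]

    def encrypt_query(embedding: list[float]) -> bytes:
        # Placeholder for homomorphically encrypting the embedding on-device;
        # real HE ciphertexts look nothing like this.
        return bytes(int(x * 255) for x in embedding)

    def random_decoy(size: int) -> bytes:
        # Fake queries sent alongside the real one so the server cannot
        # tell which query is genuine.
        return bytes(random.randrange(256) for _ in range(size))

    def relay_to_server(queries: list[bytes]) -> list[bytes]:
        # Stand-in for the OHTTP relay hop that hides the device's IP address.
        # The real server compares each encrypted query to its landmark index
        # and replies in encrypted form, never seeing plaintext embeddings.
        return [bytes(reversed(q)) for q in queries]   # dummy "encrypted matches"

    def enhanced_visual_search(region: bytes) -> bytes:
        query = encrypt_query(embed_region(region))
        batch = [query] + [random_decoy(len(query)) for _ in range(3)]
        random.shuffle(batch)                  # ordering reveals nothing either
        encrypted_matches = relay_to_server(batch)
        # Only the device knows which query was genuine, so it keeps that
        # answer and discards the decoys' answers.
        return encrypted_matches[batch.index(query)]

    print(enhanced_visual_search(b"pretend pixels around the Eiffel Tower"))

The design hinges on two points made in Apple’s descriptions: the genuine query is indistinguishable from the decoys on the server side, and the relay hides the device’s IP address, so, in Apple’s telling, the server learns neither who is asking nor what the photo contains.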

However, the highly technical document does not convince Johnson.

“I don't understand most of the technical details of Apple's blog post. I have no way to personally evaluate the soundness of Apple's implementation of Enhanced Visual Search. One thing I do know, however, is that Apple computers are constantly full of privacy and security vulnerabilities, as proved by Apple's own security release notes,” he said.

The developer believes that by enabling the feature without asking, Apple disrespects users and their preferences.

“I simply have no interest in the Enhanced Visual Search feature, even if it happened to work flawlessly. There's no benefit to outweigh the risk.”

How to turn the feature off

To turn off Enhanced Visual Search at any time on your iOS or iPadOS device, go to Settings > Apps > Photos and switch off the Enhanced Visual Search toggle.


On Macs, open the Photos app, go to Settings > General, and uncheck Enhanced Visual Search.

Ongoing discussion

Johnson’s blog post received a lot of publicity, drawing both criticism and praise. Johnson addressed critics who dismissed his concerns because he admitted to not fully understanding Apple’s technical documentation on homomorphic encryption.

“My critics appear to argue that either I've neglected to do basic research or that I'm not qualified to raise questions about Enhanced Visual Search if I don't fully understand the technical details. Both arguments are absurd.”


The author argues that users shouldn’t need to be cryptography experts to question the privacy implications of new features.

What cannot be disputed is that Apple did not make the feature opt-in, and many are unhappy about that. Metadata from photos with “regions of interest” has already reached Apple’s servers, and it may be too late to protect the data in existing photo libraries.

“I’m a cryptographer and I just learned about this feature today while I’m on a holiday vacation with my family. I would have loved the chance to read about the architecture and think hard about how much leakage there is in this scheme, but I only learned about it in time to see that it had already been activated on my device,” a user on Hacker News posted.
