Meta and Apple implicated in 3D web human rights warning

The cameras, sensors, and microphones needed to make extended reality (XR) 3D devices work are a data harvester’s paradise that could, once again, see our privacy turned into corporate profits, a US academic report warns.

You should think twice before signing up to an augmented or virtual reality game or fitness tracking app, suggests research from NYU Stern. If not, you could find your smallest, most personal physical reactions and movements being tracked, logged, and sold on to the highest bidder – in yet another example of how Big Tech is profiteering from the erosion of our privacy.

As two major players in the metaverse-driven XR industry, Meta and Apple potentially wield “an inordinate amount of power to extort, manipulate, and coerce,” the New York university implies.

To be clear, it stops short of outright accusing the tech giants of doing so, but draws attention to how the technology incorporated in their devices could enable such intrusive use of power.

“Privacy is recognized as a human right for a reason. Liberal democracy is rooted in the assumption that individuals are entitled to a personal sphere exempt from state (and corporate) interference,” says NYU Stern.

“Without this sphere, individuals cannot freely explore and develop their identity and convictions. We carry on daily life assuming that no one has access to our innermost thoughts, medical conditions, sexual preferences, and emotional vulnerabilities.”

And therein lies the problem with the XR technologies created by leading tech firms like Meta and Apple – they “are designed to collect and process precisely such intimate personal information.”

"Large and powerful metaverse platforms could track billions of people and impart influence on select individuals by altering the world around them in targeted and adaptive ways."

Extended reality expert Louis Rosenberg, as quoted by NYU Stern

What makes XR work also makes it dangerous

“Conventional XR hardware is equipped with sensors that continuously track at least three types of user data,” NYU researchers explain. “Head movements, eye movements, and spatial maps of physical surroundings.”

Without this facility, the technology would be unusable and virtual reality games would be far from enjoyable or practicable. The immersive experience of XR would not only lose much of its realism, but users would also suffer motion sickness and even run into physical objects in the real world while gaming.

Unfortunately, the nuanced movement-tracking software that prevents such advanced gaming experiences from becoming an ordeal is also a surveillance capitalist’s dream, to borrow a term from academic Shoshana Zuboff cited by NYU Stern.

“Advanced hardware can track additional bodily data for even more realistic 3D immersion and virtual interaction,” it adds. “This includes facial expressions for avatar animation, hand and other limb movement for full-body immersion and multi-user interaction, pupil dilation data to render crisper visuals, iris or retina images for user authentication, and voice data to enable oral interactions and commands.”
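The sensor categories the report lists can be pictured as a single per-frame telemetry record. The sketch below is purely illustrative – the field names are hypothetical and do not correspond to any vendor’s actual API – but it shows how much intimate signal one rendering frame could carry.

```python
from dataclasses import dataclass, field

# Hypothetical per-frame telemetry record for an XR headset, modeled on the
# sensor categories NYU Stern describes. All names are illustrative only.
@dataclass
class XRFrame:
    timestamp_ms: int
    head_pose: tuple  # head movement: (pitch, yaw, roll) in radians
    gaze_direction: tuple  # eye movement: (azimuth, elevation)
    pupil_diameter_mm: float  # pupil dilation, per the report
    hand_joints: list = field(default_factory=list)  # limb tracking
    face_blendshapes: dict = field(default_factory=dict)  # facial expressions
    room_mesh_vertices: int = 0  # spatial map of physical surroundings

# A headset typically produces such a record dozens of times per second.
frame = XRFrame(
    timestamp_ms=0,
    head_pose=(0.10, -0.30, 0.00),
    gaze_direction=(0.05, -0.02),
    pupil_diameter_mm=3.8,
)
print(frame.pupil_diameter_mm)
```

At 60-90 such frames per second, even a short session yields hundreds of thousands of data points about involuntary bodily behavior.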

Such features could make XR “the most dangerous tool of persuasion ever created” that pairs “real-time” surveillance and influence – and that’s according to an expert in the technology, Louis Rosenberg, cited by NYU Stern.

In Rosenberg’s words, this essentially means that “large and powerful metaverse platforms could track billions of people and impart influence on select individuals by altering the world around them in targeted and adaptive ways.”

“Such influence could easily slip into insidious manipulation, whereby users’ involuntary reactions to stimuli get fed into a system that curates their digital content and experience in a way that optimizes for specific responses,” adds NYU Stern. “For example, their decision to purchase a particular product or inclination to believe certain disinformation.”

Putting Pegasus to shame

If aggregated and exploited, such data could give the tech companies behind XR a kind of power that puts other surveillance technologies, such as NSO Group’s Pegasus spyware, in the shade.

“The types and volumes of data that XR devices can collect make them several orders of magnitude more invasive than traditional web-tracking and surveillance technologies,” says NYU Stern.

“When analyzed over time, they reveal an individual’s involuntary and immutable characteristics, including their vocal inflections, gait patterns, detailed facial expressions and gestures, movement idiosyncrasies, and real-time physical responses to stimuli.”

But, if virtual gaming is a level up in terms of data surveillance, fitness and wellbeing apps could go even further, NYU Stern warns.

“Depending on the immersive experience, a user may also consent to other types of physiological tracking, such as their heart rate, respiration, and blood pressure,” it says. “Such tracking may be desirable in certain fitness, wellness, and medical applications,” it adds, but “when XR systems are given access to other bodily data [...] they enable even more sophisticated analysis of users’ physical, emotional, and mental states.”

This could be used to construct “biometric psychography” profiles of a person’s individuated interests, aversions, and vulnerabilities, “based on their involuntary and often unconscious reactions to stimuli.”

This type of data collection and analysis constitutes “mind reading” in the form of a continuous recording of what users look at, how long their attention is captured, and even how they feel about what they are witnessing, NYU Stern asserts.
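The "mind reading" the report describes boils down to simple aggregation: a continuous gaze log can be reduced to a ranked attention profile. The toy example below is not from the report – the scene objects and numbers are invented – but it illustrates the mechanism.

```python
from collections import defaultdict

# Toy illustration: each sample is (object the user fixated on,
# milliseconds of dwell time), as a continuous eye tracker might log it.
gaze_log = [
    ("sneaker_ad", 420), ("news_panel", 110), ("sneaker_ad", 530),
    ("avatar_bob", 90), ("sneaker_ad", 610), ("news_panel", 80),
]

# Sum dwell time per object - total attention captured by each.
dwell = defaultdict(int)
for target, ms in gaze_log:
    dwell[target] += ms

# Rank by attention: the kind of "interest" signal an ad-optimizing
# system could feed back into content curation.
profile = sorted(dwell.items(), key=lambda kv: kv[1], reverse=True)
print(profile)  # [('sneaker_ad', 1560), ('news_panel', 190), ('avatar_bob', 90)]
```

Pair that ranking with pupil dilation or heart-rate data and the system learns not just what users looked at, but how they reacted to it.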

"Once users turn on eye tracking, hand tracking, audio, and facial-expression tracking to enhance their immersive experience, their data is subject to wide-ranging use by the company."

NYU Stern casts doubt on Meta's claims that it puts user privacy above all else

Apple doing more to protect data than Meta

“Meta and Apple – two of the largest competitors in XR – are known to take divergent approaches to user data,” says NYU Stern.

Whereas Meta harvests user data to create consumer profiles for targeted advertisements, Apple makes most of its money by selling hardware and charging hefty commissions on apps and in-app purchases sold in its store.

Apple’s business model allows the company to pass itself off as “a credible guardian of privacy,” while Meta’s policy is “questionable” given its data-driven ad-based model.

“Meta’s terms of service and privacy policies applicable to its metaverse products leave the door open for the company’s continued monetization of user data collected by the devices,” says NYU Stern.

“Once users turn on eye tracking, hand tracking, audio, and facial-expression tracking to enhance their immersive experience, their data is subject to wide-ranging use by the company. Meta disclaims responsibility for the data practices of third-party developers with whom the company shares user data.”

However, that doesn’t mean that Apple gets off scot-free in NYU Stern’s book. Though it has “vowed not to collect any eye-movement data, whether raw or abstracted,” it has declined to comment on what it does with body- and face-tracking data.

NYU Stern also points out that Apple has yet to release a detailed privacy policy for its Vision Pro headset, expected to go on sale early next year.

“But if the company plans to make available any multi-user 3D immersive experiences through its new spatial computing platform, it will have to contend with how to protect sensitive motion data from misuse by third parties,” adds NYU Stern.

Laws don’t go far enough – not in the US, nor anywhere else

While the state of California and the European Union are singled out for praise for their efforts to legislate against widespread misuse of user data by 3D metaverse companies, the rest of the US – and indeed the world – does not come off so well in NYU Stern’s assessment.

“Unfortunately, existing laws in the United States and in most parts of the world contain loopholes that make these dystopian futures all too possible,” it concludes, adding that the US lacks robust federal privacy laws.

State laws within America, on the other hand, are too narrow because they prioritize personally identifying information such as names, addresses, and financial account numbers while doing less to protect biometric data.

“California has a law that potentially protects consumers against profiling not necessarily tied to identification – but only in some cases,” adds NYU Stern.

The California Consumer Privacy Act (CCPA) protects personal data that is used “to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.”

But there is a gap in its wording: the law “gives California residents the right to opt out of the sale or sharing of such data, but not its collection or use by the entity doing the collection.”

Nor does the CCPA “set limits on what first-party collectors themselves can do with the data, as long as the collector provides notice to consumers.”

“Moreover, the law adopts a narrow definition of ‘business’ that potentially leaves out many content developers and small platforms that are likely to handle user data in XR,” adds NYU Stern.

When it comes to the rest of the world, the EU’s General Data Protection Regulation (GDPR) is best equipped to handle potential widespread and deep abuse of user privacy by XR companies, it adds.

“The law applies to any type of personal data that is ‘used for learning or making decisions about an individual’ and which ‘could have an impact on [that] individual,’” says NYU Stern, citing the GDPR, which it says “also bans the processing of certain categories of sensitive personal data, unless the individual provides explicit consent.”

However, once again, even the GDPR falls short, according to NYU Stern researchers, and “nearly all legal protections of personal data – in Europe, the US, and elsewhere – disappear if a consumer consents.”

"Companies can voluntarily adopt best practices on data protection and use it to provide their customers with a strong first line of defense against illegitimate practices."

NYU Stern implores Big Tech firms to take more responsibility

Big Tech must address this global issue (but will it?)

In fact, at a global level, the legal protections “leave most of the world’s population without a say on how technology companies use their data.”

Perhaps the biggest flaw in the legislation is the same one that has bedeviled the internet and those who would seek to regulate it for decades: “immersive platforms are borderless, but regulation is jurisdictional.”

“The GDPR primarily protects residents of the EU,” claims NYU Stern. “In the US, some states lack any type of legislation on data privacy, leaving their residents at the mercy of companies and data brokers.”

To help tackle this problem, NYU Stern urges the 3D metaverse industry to get ahead of the curve by acting of its own accord and “not wait for regulation to act responsibly.”

“Companies can voluntarily adopt best practices on data protection and use it to provide their customers with a strong first line of defense against illegitimate practices that violate the rights to privacy and autonomy,” it adds, referencing suggested best practices in its report.

Of course, whether the likes of Meta and Apple do anything like this remains to be seen.

