Think twice before confiding in an AI assistant: your intentions might soon be feeding a new AI economy, researchers from the University of Cambridge warn.
Everything you do online leaves a trace, and these digital footprints generate significant income for the ad industry. Ads on Meta's platforms or on Google that target you based on your digital preferences have become the norm. AI, however, is taking this to a whole new level.
A group of researchers at the University of Cambridge is forecasting a dystopian future that may not be that far away. While AI assistants are becoming increasingly popular, they might be playing a double game, selling you out to big tech.
Able to guess our needs and influence decision-making at an early stage, AI models could also sell this information straight to companies interested in selling you the solution to your problem.
Researchers from Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI) note that we are on the verge of what they call a “lucrative yet troubling new marketplace for digital signals of intent.”
Such an economy of intent might stretch from everyday shopping to voting and politics, and it could be extremely profitable for advertisers, at your expense.
We share too much data with AI models
Trusting your data to popular chatbots that serve as tutors, assistants, or even virtual girlfriends might backfire, as the tech companies behind these models could try to monetize the personal data you provide.
The researchers write that large language models (LLMs) could be used to target users at low cost, drawing on their cadence, politics, vocabulary, age, gender, online history, and even preferences for flattery and ingratiation.
This is especially concerning because widely used AI models hold a treasure trove of sensitive and intimate information about users' psychological states, behavior, and even their deepest desires.
“What people say when conversing, how they say it, and the type of inferences that can be made in real-time as a result, are far more intimate than just records of online interactions,” said LCFI Visiting Scholar Dr. Yaqub Chaudhary.
Combined with data gathered from users' online activities, this would give AI models a powerful dataset that could enable “social manipulation on an industrial scale.”
“Tremendous resources are being expended to position AI assistants in every area of life, which should raise the question of whose interests and purposes these so-called assistants are designed to serve,” adds Chaudhary.
For decades, attention has been the currency of the internet giants, but inner motivations could be the next big digital currency.
“It will be a gold rush for those who target, steer, and sell human intentions,” said Dr. Jonnie Penn, a historian of technology from Cambridge’s LCFI.
“We should start to consider the likely impact such a marketplace would have on human aspirations, including free and fair elections, a free press, and fair market competition before we become victims of its unintended consequences.”
Tech giants are aiming to use AI to hunt for intentions
While the intention economy is still an “aspiration” for the tech industry, major tech companies have hinted that they’re working on it.
In 2024, Apple’s new “App Intents” developer framework for connecting apps to Siri (Apple’s voice-controlled personal assistant) included protocols to “predict actions someone might take in the future” and “suggest the app intent to someone in the future using predictions you [the developer] provide.”
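For a sense of what that framework looks like from the developer’s side, below is a minimal, hypothetical App Intents definition in Swift. The intent name and behavior are invented for illustration; the prediction and suggestion hooks the researchers quote are layered on top of app actions declared roughly in this form.

```swift
import AppIntents

// A minimal, hypothetical app intent: the app exposes an action
// ("reorder coffee") that Siri and the system can surface and run,
// and that can feed the framework's prediction and suggestion features.
struct ReorderCoffeeIntent: AppIntent {
    static var title: LocalizedStringResource = "Reorder Coffee"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific logic would place the order here.
        return .result(dialog: "Your usual order is on its way.")
    }
}
```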
Last year, OpenAI, in a blog post, called for “data that expresses human intention… across any language, topic, and format.” At a conference in 2023, the director of product at Shopify spoke of chatbots coming in “to explicitly get the user’s intent.”
Nvidia’s CEO has openly discussed using LLMs to identify intentions and desires, while Meta introduced "Intentonomy," a dataset for understanding human intent, in 2021.
“These companies already sell our attention. To get the commercial edge, the logical next step is to use the technology they are clearly developing to forecast our intentions and sell our desires before we have even fully comprehended what they are,” said Chaudhary.