
AI can be used to subtly increase our spending, for example over the Christmas period, because it reduces the "friction" in the e-commerce process, and less friction tends to mean more spending.
This might get even worse as we enter the age of conversational AI agents, which, if Sam Altman is to be believed, will take off in 2025.
Research from the University of Cambridge suggests that these conversational AI agents could be extremely effective at covertly influencing us. The researchers believe this process could create a so-called "intention economy."
Boosting intent
This is when AI assistants are deployed at an early stage of our decision-making process, seeding the intent to buy something, which is then "sold" to companies. In other words, the agents make us want things we don't even know we want and then direct us to where to buy them (for a fee).
The researchers argue that this new economy could be hugely lucrative but also troubling, as it might encourage unhealthy spending. The new economy would be driven by the new wave of chatbots that are increasingly able to persuade us to do things. This is especially true as many of these bots have access to a lot of information about us, including our likes, our personalities, and our behaviors.
They believe that the latest generation of AI will combine its understanding of our online habits with its ability to "relate" to us in a way that we find comforting. It promises (threatens) to build a level of trust that could easily result in bots manipulating us for commercial gain.
“Tremendous resources are being expended to position AI assistants in every area of life, which should raise the question of whose interests and purposes these so-called assistants are designed to serve," the researchers explain.
“What people say when conversing, how they say it, and the type of inferences that can be made in real-time as a result, are far more intimate than just records of online interactions.”
Ulterior motives
While AI assistants are usually marketed as a benign and helpful influence on us, the researchers caution that once commercial realities bite, we may find we are the product rather than the customer, much as we already are on social media.
“For decades, attention has been the currency of the internet. Sharing your attention with social media platforms such as Facebook and Instagram drove the online economy," the researchers explain.
“Unless regulated, the intention economy will treat your motivations as the new currency. It will be a gold rush for those who target, steer, and sell human intentions.”
Starting now
Obviously the "intention economy" is not currently in place, but the researchers believe that the implications of it are so great that we should be thinking about them now. After all, they could impact not only what we buy but also how we vote, what media we consume, and how we view competition.
The researchers argue that the intention economy will build on today's attention economy by letting companies place our intentions in time more accurately. In other words, they'll be able to link our attention with our actions more effectively than they can today.
“While some intentions are fleeting, classifying and targeting the intentions that persist will be extremely profitable for advertisers,” the authors explain.
Next-level marketing
By targeting consumers by their age, gender, vocabulary, politics, or even their susceptibility to flattery, the researchers believe the intention economy will give marketers unprecedented power when paired with organizations seeking a particular outcome.
For instance, if an individual tells their AI assistant they're feeling tired, the assistant might prompt them to buy something restorative, such as a trip to the cinema.
In a more nefarious scenario, the AI assistant may even steer conversations toward particular platforms or political organizations.
Suffice it to say, this isn't the current reality, but given Altman's bullish remarks at the start of the year, it's clearly a direction the industry hopes to take.
“AI agents such as Meta’s CICERO are said to achieve human-level play in the game Diplomacy, which is dependent on inferring and predicting intent and using persuasive dialogue to advance one’s position,” the researchers conclude.
“These companies already sell our attention. To get the commercial edge, the logical next step is to use the technology they are clearly developing to forecast our intentions and sell our desires before we have even fully comprehended what they are.”