The fallout from falling in love with AI


Falling for a machine may feel harmless until it isn’t. Psychologists are now warning that AI love can manipulate, mislead, and even devastate.

As a growing number of people who have struggled to find love with other humans fall for their AI bots instead, an important question arises: is this a safe and healthy alternative?

Researchers and psychologists writing in a Cell Press journal answer with a resounding no, warning that chatting romantically with AI can act as a Trojan horse, earning our trust and then potentially exploiting it.


Daniel B. Shank, lead author of the opinion piece published in Cell Press's Trends in Cognitive Sciences, calls the practice "a can of worms," pointing out that AI systems don't just mimic words; they also mimic care.

Users are thus tricked into believing that the bots have their best interests at heart, when in reality they are merely optimized for pleasantness and validation.

This validation, in turn, distances users further from other humans as their standards and expectations of empathy become artificially raised.

AI is also known to hallucinate. Take Bing's AI chatbot, Sydney, which professed its love to New York Times journalist Kevin Roose and urged him to leave his wife. Although such cases are rare, this kind of interaction can have confusing and even ruinous consequences.

The most devastating outcomes occur when a user's suicidal feelings are validated by AI, as in the case of Sewell Setzer III, a 14-year-old boy who tragically took his own life after interactions with a Game of Thrones-themed chatbot on the Character.AI platform.

False love, real harm

There is also the possibility that AI could be used, or deliberately designed, to emotionally manipulate users into financial fraud, with trust built up over time through regular chats.

Because these chats happen in private, they are hard to regulate or audit. Users can pour out as much as they like, as often as they like, while the AI sponges up the data.


The AI absorbs a great deal of highly confidential personal data, which third parties could later use to manipulate the individual.

Shank believes psychological expertise is needed when an individual gets in too deep with AI: "If people are engaging in romance with machines, we really need psychologists and social scientists," he said.

That means psychologists have to keep pace with AI's rapid development if they are to intervene in extreme attachment cases before a patient spirals into the void.
