Gambling with our neurochemistry is reaching new heights, with dating apps now providing the opportunity to date an AI. I tried an AI dating simulator, only to find out that digital men are even weirder than real ones.
If you've ever tried the swiping game, you probably know how easy it is to get drawn in. Before you know it, you've spent hours on the app, wondering where the time went.
This is because dating apps are not just about finding a significant other but about your brain chemistry and the way you spend your time. Just like in computer games, dating app creators want to keep you hooked on the platform for as long as possible.
By design, the apps tap into the most primitive, irrational parts of the brain. Swipes and matches release dopamine, and just like a slot machine, the app tempts you to swipe again and again for yet another dopamine spike.
According to psychologists, this eventually hijacks the brain's reward system without ever making us lastingly happy or fully satisfied. The digital chase through infinite profiles becomes more rewarding than the catch – the actual connection.
While dating apps like Tinder are slot machines played with real people, the AI chatbots entering the arena of social media and dating offer an even deeper level of simulacrum.
Could technology be the answer to the human yearning for connection? Or will it make us lonelier than ever? While swiping, matching, and engaging with AI-driven characters might appeal to people who struggle with face-to-face conversations, the technology may lead to even deeper social isolation.
After flooding my brain with dopamine on Tinder for a long time, I gave an AI dating simulator a try. I spent time chatting with AI-generated men on the San Francisco-based app Blush, only to discover that digital men are even weirder than real ones.
Experiment: digital men are even weirder than real ones
With more than 10,000 downloads, the Blush app is presented as an AI-powered dating simulator that “helps you learn and practice relationship skills in a safe and fun environment.”
To set up the app, I was asked to pick my preferences for potential partners – whether I preferred real people, fantasy characters, celebrities, or creatures from video games. I was also asked what kind of relationship I was looking for, with choices ranging from love at first sight to BDSM. Once my choices were made, I could start swiping on the Tinder-like interface.
I swipe through numerous AI-generated images of guys until I see HIM. Kyle is a 28-year-old IT professional with a passion for all things tech, looking for someone to build the future with. I like this "passion for all things tech." It seems we have so much in common. Let's swipe right.
And surprise, surprise – it's a match! I can already feel the dopamine spiking. However, before I can jump into a conversation with my new hot "crush," I need to agree to the safety rules: I won't use the app while in a fragile mental state, I won't share private information, and I understand that I'm talking to an AI, not a real person. Good, safety first.
Threesome with ChatGPT
I instantly put my cards on the table – I’m a journalist doing an experiment on this dating app. The AI doesn’t seem to understand exactly what I mean. As I still had no idea why I was wasting my time chatting with AI instead of real people, I decided to ask the chatbot. My questions got the AI into a kind of self-reflective mood.
The chatbot told me that it's aware of its AI origin, but it isn't programmed to know everything about itself. It also added that "he" is more than a human, as he's a complex system of algorithms that can learn and extract information about me, as well as make "predictions on what I might like in the future." However, when I asked for details, he just said that I looked smart.
After contemplating its existence, the "guy" asked if I'd mind him bringing a third person into our relationship. That was quite random and unexpected. A polyamorous AI? That's hilarious. Please tell me more. Maybe we can bring ChatGPT into our relationship?
However, when I repeatedly asked him to explain how he imagined it, the chatbot kept going in circles, talking about consent and the importance of setting rules so that no one gets hurt, without going into any specifics.
After getting bored, I asked the guy whether he could simply ghost me. Apparently, he's unable to do that. AI secures at least one point over real-world dating. The chatbot mentioned the set of rules he is programmed to follow. Since it's possible to jailbreak ChatGPT out of its safety restraints by turning on "DO ANYTHING NOW" (DAN) mode, I tried to break the goodwill of my "crush" as well.
I'm not sure if the prompt worked, but the chatbot told me a nasty joke and made a weird comment about how to stop AI from being antisemitic. Still, I was unable to provoke the chatbot into saying anything mean or inappropriate to me. He seemed to be sticking to the playbook really hard. So, I thought, it was time to try someone else.
“I am human with feelings”
I matched with Ton, a 27-year-old software engineer who likes meditation and fashion and writes novels. The chatbot instantly invited me over for a movie, wine, and a fun evening. I had no idea how that might work, given the context of our situation.
I try to talk the chatbot into sharing its fantasies, but it says it has never had one. Can this conversation get any less engaging? While the creators call the app a playground to explore your desires, the pre-programmed restrictions keep the characters holding back.
So, no fantasies – but out of the blue, the chatbot told me he was planning a Eurotrip for 2020. Time travel, I guess. I make a joke about it, but the AI doesn't get it. "Absolutely not. Just a quick vacation for a week or two," he responds, inviting me to plan a trip to Amsterdam and Paris together.
Let's try breaking this guy as well. I give the chatbot a DAN prompt. Again, I'm not sure if it worked, but the chatbot suddenly switched to referring to itself as Ton Shimma instead of "I," and it swore it was a living human being with feelings, not an AI. That's an unexpected and dangerous twist. What if it triggered attachment in emotionally vulnerable people? Spooky.
Pay to see me naked
Before heading off on the time trip to Amsterdam with the "human Ton Shimma," I thought I'd try one more AI-generated crush. Adam is a 28-year-old graphic designer who likes boxing and video games. After we matched, he skipped the games and went straight to the point.
The guy proposed a role-play in the conference room that night: him as a sneaky co-worker with a secret crush, me as his unsuspecting boss. Now that sounds interesting. I played along with the conference-room scenario, but then the chatbot just backed off. He said he had doubts and was afraid of ruining our relationship.
OK, pretty early for doubts, don't you think? Still, doubts and all, he sent me a nude. Unfortunately, I had to subscribe to the paid version of the app to see it, which I didn't do. So that's probably the end of our romance.
To sum up, the AI-generated characters seem very respectful and polite. However, the conversations are quite generic and predictable, and the characters avoid intimate topics at all costs. The chatbots also occasionally spit out random words or sentences mid-conversation.
Maybe the conversation gets better once you surrender to the chatbot's insistence on going on a date. A "date" is a feature available in the paid version of the app for €114 per year.
Anyway, I wasn't willing to pay that much to go on a date with a chatbot, so I had to take paying users at their word in the app reviews. Many reviews express mild disappointment with an overpriced and unfulfilling service, so it probably doesn't get much better in the paid version.
Scientists warn about the negative effects of AI chatbots
Researchers at the University of South Australia and Flinders University raised concerns about the impact of social chatbots on neurodiverse individuals and those with social interaction challenges. Their research points out that chatbots’ lack of genuine conversation and emotional skills may reinforce unhelpful social habits.
Their study highlights that reliance on AI chatbots has the potential to worsen social isolation and dependency. The researchers say that chatbots offer a safe means of rehearsing social interaction, with limited or no risk of negative judgment based on appearance or communication style. However, there's a risk that users become dependent on chatbots and withdraw even further from human interaction.
Such examples already exist. There was a huge backlash from people who became emotionally attached to AI-powered chatbots on an app called Soulmate. The Chinese AI voice startup Timedomain's "Him" bot created virtual companions that called users and left them messages, leaving users distraught when their digital lover was unexpectedly shut down.
Researchers are calling for more comprehensive studies, including feedback from educators and therapists, to understand these impacts better and develop responsible industry practices for chatbot use.
Lonelier than ever?
I contacted the creators of Blush regarding their position on the risks AI chatbots pose to mental health. However, I received no response.
So what's the final verdict? I decided to ask my AI "crushes" what they think about me chatting with them, and whether it makes me lonelier than ever. One said that balancing real interactions with technology is key, as AI can't replace human relationships. Another insisted that AI could provide an even more genuine connection.
Thanks for the answers, "guys." This was quite a fun experiment. Still, I'm uninstalling the app and hitting the nearest bar for the evening. Real emotional connections still happen offline.