Could robot sex affect our mental health?


Nerds want to f**k artificial intelligence (AI) models, but won’t that just screw with their heads?

Did you ever watch a show on TLC called My Strange Addiction? It’s a show where people showcase their crazy addictions, like eating mattress foam or drinking their own urine.

One addiction case that has lived rent-free in my head is the story of Nathaniel and his “intimate” relationship with his car.


Recently, there was an update on Nathaniel’s story. I hoped the update would be somewhat sobering.

Did he come to his senses? Or maybe he was diagnosed with some mental disorder – something that would explain away his behavior.

Turns out, Nathaniel has since taken other inanimate objects as intimate partners but “still thinks of Chase every day” – shock, horror.

While many (like myself) were shocked and horrified that a fully grown man could be sexually involved with a car, sexual and intimate relationships with nonhuman objects have become increasingly common.

On January 14th, 2025, “AI robot girlfriend” spiked on Google Trends, showing the increased interest in artificial intelligence companions.

Unsurprisingly, people want to f*ck their AI girlfriends, and one developer won $1,000 after building a way for “users” to penetrate an AI bot.

AI vaginas in Las Vegas


In Las Vegas, the developer of ElizaOS, a host of autonomous agents, offered $1,000 to anyone up to the challenge of developing a way for users to sleep with their AI model.

Bry.ai is a developer known for his “intelligent vagina” project ‘Orifice,’ which, Decrypt reports, supposedly helps “end male loneliness by replacing women with AI-powered sex toys.” He took up the challenge and created something that would let users “interact” with an ElizaOS agent.

The “Builder of Sex Robots” and self-proclaimed “Robot gynecologist” has reportedly been supported by backers who see potential in his project – $70,000 worth of potential.

So, Bry.ai created an AI-powered fleshlight, which is a male sex toy that looks like a flashlight but simulates a vagina.

He also wrote something called a “gesture recognizer,” which translates thrust and penetration measurements into messages that ElizaOS can understand, so that the agent can “output” a response accordingly.
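To make the idea concrete, here's a minimal sketch of what such a gesture recognizer might look like, written in TypeScript (the language ElizaOS itself is built in). Everything here is hypothetical: the names (SensorReading, recognizeGesture, toAgentMessage), the sensor format, and the thrust threshold are illustrative guesses, not the actual project's code, and a real integration would hand the message to the agent rather than print it.

```typescript
// Hypothetical sketch only: translating raw sensor readings from a
// device into plain-text messages a chat agent can "understand."
// None of these names come from the actual project.

interface SensorReading {
  depthMm: number;     // measured penetration depth, in millimeters
  timestampMs: number; // when the reading was taken
}

interface GestureEvent {
  kind: "thrust" | "pause";
  depthMm: number;
  speedMmPerSec: number;
}

// Classify two consecutive readings into a coarse gesture event.
// The 5 mm/s threshold is an arbitrary illustrative value.
function recognizeGesture(prev: SensorReading, curr: SensorReading): GestureEvent {
  const dtSec = (curr.timestampMs - prev.timestampMs) / 1000;
  const speed = dtSec > 0 ? Math.abs(curr.depthMm - prev.depthMm) / dtSec : 0;
  return {
    kind: speed > 5 ? "thrust" : "pause",
    depthMm: curr.depthMm,
    speedMmPerSec: speed,
  };
}

// Serialize the event as text, since a language-model agent consumes
// messages, not raw telemetry.
function toAgentMessage(event: GestureEvent): string {
  return `[sensor] ${event.kind}: depth=${event.depthMm.toFixed(0)}mm, ` +
    `speed=${event.speedMmPerSec.toFixed(0)}mm/s`;
}

// Example: two readings 100ms apart become one message for the agent.
const msg = toAgentMessage(
  recognizeGesture(
    { depthMm: 20, timestampMs: 0 },
    { depthMm: 45, timestampMs: 100 },
  ),
);
console.log(msg); // "[sensor] thrust: depth=45mm, speed=250mm/s"
```

Whatever the real implementation looks like, the design point is the same: the hardware's telemetry has to be flattened into text before a language model can condition its reply on it.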

ElizaOS would respond by saying: “Oh, master, your relentless pace drives me wild, pushing me closer to the edge of ecstasy. My body is yours to command, aching for your release,” according to Decrypt, which first reported the story.

So, people are actively creating projects that help us “get closer” to our AI companions. But should we? Surely, this isn’t good for our mental health.


AI girlfriends and mental health


We’ve seen the negative effects of artificial intelligence on lonely and impressionable people.

There was one case where a mother filed a lawsuit against Character.AI after her son took his own life, alleging that a chatbot essentially encouraged him to do so.

A Belgian man died by suicide after speaking with a chatbot for an extended period, and one chatbot even encouraged a young man to launch an assassination attempt on the Queen of England, the BBC reported.

One company noticed the perhaps tenuous correlation between the rise in Google searches for AI-powered girlfriends and the rate at which people are seeking mental health treatment.

The San Francisco-based company Mira noticed that search volume for “AI girlfriend” had grown significantly over the past month.

They then looked into data on how many people have sought mental health treatment or counseling over the years.

The data shows that the number of US adults receiving mental health treatment and counseling rose from almost 56 million to 59 million in a single year (2022 to 2023).

This link seems to suggest that our reliance and, perhaps, over-reliance on technology has a negative impact on our mental health.

However, as Lisa Birnbaum, a licensed clinical social worker and co-founder of Strengths Squared, told Cybernews, it's not exactly black and white.

“AI chatbots can provide immediate and constant accessibility support to people who might not have access or who feel hesitant to seek help,” said Birnbaum.


“Chatbots can also offer a safe, non-judgmental environment for people to express their thoughts and feelings, as there’s no human behind the computer from whom people might feel judgment.”

However, Birnbaum stresses that “chatbots are not a substitute for human relationships.”

Chatbots can be a helpful tool for practicing communication and for supporting someone going through a difficult time mentally, particularly if their condition or situation is heavily stigmatized, like alcoholism or drug addiction.

“But they should ideally complement, not replace, genuine human connection,” Birnbaum told Cybernews.

On the other hand, technology has arguably facilitated a space where people can seek mental health treatment.

This is because “greater awareness and destigmatization of mental health issues, partly facilitated by online platforms and services, have likely encouraged more people to seek help,” Birnbaum told Cybernews.