
Psychedelic drug users are turning to artificial intelligence (AI) chatbots like ChatGPT as “trip sitters.” Experts warn this is a potentially dangerous practice.
An article from the MIT Technology Review details the real-life case of a man called Peter, who took around eight grams of magic mushrooms during a time of crisis.
This was in 2023. Peter was a master's student who had lost his beloved pet and his job. When it rains, it pours.
So, to alleviate some of this stress, he took a trip. Like many others who have turned to psychedelics as a form of self-therapy, he found it didn’t work out for the best.

After realizing he’d taken too much, he became frantic, desperately looking for a way to relieve his anxiety.
Peter had no family or friends to console him, so he picked up his phone and sent a message to ChatGPT, a popular AI chatbot, stating that he’d taken too much.
ChatGPT responded in its regular, arguably sycophantic, agreeable tone and reassured Peter that the moment would pass and the feeling was only temporary.
While this might seem like an isolated incident, many people are combining psychedelic therapy and chatbot therapy to alleviate mental health issues.

Each has known positive effects on its own, according to MIT Technology Review, with studies suggesting that both psychedelic therapy and chatbot therapy can help treat serious mental health issues.
Together, however, they could be dangerous. Experts describe the combination as a troublesome “psychological cocktail” that could have detrimental effects on a person’s mental health.
MIT Technology Review referenced Reddit users' experiences of using ChatGPT and other AI models as “trip sitters.”
One user said, “Using AI this way feels somewhat akin to sending a signal into a vast unknown – searching for meaning and connection in the depths of consciousness.”

There are even AI models designed specifically to trip sit people who have taken psychedelic drugs.
While this might make for good entertainment, many experts agree that substituting human therapists with AI chatbots while using psychedelics is not recommended.
Because chatbots tend to be sycophantic, a byproduct of companies wanting to keep users engaged, the advice you receive from ChatGPT and other models may not be entirely genuine.
On psychedelics, you may also be more inclined to act on whatever a chatbot says, which is especially dangerous in such a vulnerable state.

The issues with using AI chatbots as therapists
Almost since chatbots first appeared, people have been using AI as a stand-in therapist, something to confide in, but doing so has sometimes had extremely negative consequences.
In recent years, the media have reported on various suicides linked to conversations with AI models.
An unnamed Belgian man took his own life after speaking to a chatbot, Eliza. The man’s widow claims that if he hadn’t had conversations with this chatbot, her husband would still be here.
Similarly, a minor became detached from reality while talking to AI chatbots on Character.AI and ended up taking his own life.

The New York Times (NYT) reports that the ninth-grader’s mother is filing a lawsuit against the AI role-playing app.
Users want a friendly, non-judgmental AI, but they also want these chatbots to be honest.
However, ChatGPT's personality often drifts toward excessive agreeableness, and OpenAI continues to face a new era of reputational risk, Cybernews previously reported.