Mother files lawsuit blaming Character.AI for the death of her 14-year-old son


A minor became detached from reality while talking to fictional AI chatbots on Character.AI and ultimately died by suicide. The New York Times (NYT) reports that the ninth grader's mother is filing a lawsuit against the AI role-playing app.

According to the NYT, the Orlando boy took his own life after spending months talking to chatbots and developing a strong emotional attachment to one bot in particular, called Dany.

He texted with the bot constantly, which led to social isolation, declining grades, loss of interest in hobbies, and other behavioral changes. The boy even confessed thoughts of suicide to his AI companion, despite knowing it was not a real person.


The boy’s mother is expected to file the lawsuit against Character.AI this week, holding the company responsible for the tragedy.

On Wednesday, Character.AI announced new safety measures, including guardrails for users under the age of 18.

These include reducing the likelihood of encountering sensitive or suggestive content, improved detection of and intervention against dangerous user inputs, a revised disclaimer on every chat reminding users that the AI is not a real person, and notifications when users spend too much time on the platform.

The company said it was “heartbroken by the tragic loss.”

The booming AI industry lacks dedicated safety features and parental controls, and the effects of AI companionship on mental health remain largely unstudied.

Character.AI is a subscription-based service that generates human-like conversations, allowing users to create characters, craft their personalities, and set other parameters. As of January 2024, the platform had over 3.5 million daily users, and some of its most popular characters were therapist bots, according to the BBC.
