AI girlfriends feast on your data


The rise of AI partners means that more people are using large language models to find love and companionship. Unfortunately, many of those people pay a high price: a hefty loss of privacy.

It's commonplace for companies to use, store, and share the personal information we input into their sites and apps.

Much of this information is already out there, since most of us have established a digital presence.


However, the data that companion chatbots collect and store is far more personal in nature.

Many romance-oriented chatbots collect a great deal of information about their users, and much of that collection is invasive.

The Mozilla Foundation analyzed 11 romantic chatbots to determine the level of privacy and security each service provides.

One of the most startling examples outlined by Mozilla is CrushOn.AI, an invasive new AI companion that has emerged among a multitude of other personal chatbots.

CrushOn is an AI chatting service without filters: a “Not Safe For Work” (NSFW) platform that lets you chat with your favorite characters.

CrushOn’s privacy policy claims, like most, that the company takes “your privacy very seriously,” yet the site collects a vast amount of unnecessary personal information about its users.

The following is an example of the types of data an AI companion can collect, taken from CrushOn’s privacy policy:

  • Audio/Visual data
  • Contact data
  • Device/network data
  • Financial data
  • General location data
  • Health data
  • Identity data
  • Inference data
  • Transaction data
  • User content
  • Consumer health data

CrushOn’s privacy policy states that it collects information ranging from psychological, behavioral, and medical interventions to gender-affirming care.

This data is supposedly used to “facilitate your chat experience” and monitor chat safety. However, the policy also lists business purposes as a reason for collecting personal data.

The app also claims it “may” use user content from character chats to train AI models, without an identifiable way to opt out.

In other words, the information you input into the chatbot can be used to train the company’s existing AI models.

As CrushOn.AI is an NSFW platform, we can infer that the information you might be sharing with the bot could be private and even erotic in nature.

As with most romance chatbots, the intimate information you share with your AI partner isn’t private.

When using these platforms, we must expect that all information we input could be used against us.

Mozilla’s analysis of these apps also revealed some shocking statistics about their overall safety.


A worrying 73% of the apps didn’t disclose any information regarding the way the company would mitigate security vulnerabilities.

Most companies (64%) were not transparent about whether their services use encryption.

Almost half of all apps analyzed by Mozilla allowed users to create weak passwords.

Almost all of these apps share or sell your personal information to third parties for the purpose of targeted advertising, and 54% of apps won’t delete your personal data.

So, while 24/7 companionship and even deep emotional connections like love are potential benefits of this new generation of personal chatbots, users should think carefully about whether they’re willing to trade in their privacy to access them.