This AI app will clone anyone you (dis)like, but needs to see your WhatsApp chat history


The latest AI companion app, BTwin, promises to clone your loved one – or favorite entertainer – so they can become your artificially intelligent emotional support system. Cybernews reveals the good, the bad, and what could possibly go wrong.

It’s “the first emotional support network inspired by clones of your loved and not-so-loved ones,” states the tagline for the BTwin AI Friends app, launched on Wednesday by WakenAI.

According to the strictly virtual company, the app, which is geared towards women, allows its users to create a “private simulation of mental health experts, loved ones, and celebrities” that will establish meaningful connections to support a user’s overall well-being.

"Through extensive trials, we identified an underserved demographic among women and, based on their feedback, designed BTwin as an entertainment wellness solution for their dating and daily life struggles," said Fernanda Beltran, BTwin Cofounder and Wellness Director.

BTwin app
Image by WakenAI

The wellness expert went on to say that the goal was to transform traditional therapy into an entertaining experience.

“Unlike social media, which merely captures attention, BTwin AI creates ergonomic companions that, through simulated friendship, are dedicated to our community's well-being," Beltran said.

The Good

According to WakenAI, the app’s intuitive messaging interface combines "advanced AI technology with therapeutic benefits to offer 24/7 emotional support" with the hopes of setting a new standard in mental wellness applications.

The app founders say they have performed the clinical trials needed to back up their claims, touting a newly developed Mind Simulation Therapy (MST), ethical practices, and the ability to foster genuine emotional connections by leveraging AI technology.

WakenAI founder and BTwin co-founder Hassan Uriostegui is a Silicon Valley entrepreneur who has written several books examining humanism, the mind, and AI.

The app refreshingly states that any harmful or explicit sexual content is prohibited to ensure a respectful, supportive, and growth-oriented atmosphere for users.

“Our detailed AI profiles provide clear insights into your AI companions' cognitive and emotional patterns,” the company said.

For those curious about how the ‘AI twins’ communicate with their users, WakenAI offers "gentle notifications" that will foster "self-reflection without overwhelming you," providing the user with a "balanced and respectful therapeutic experience."

The Bad

While it's not the first app to offer users an AI companion, it may be the first to openly admit to scraping a user's own data to create one – although it also states you can sign up anonymously.

The app claims to import the user's real-life chat history with the person they wish to clone, so it's unclear just how anonymous that is.

“BTwin is the only App able to access WhatsApp text history to clone chat relationships for users to gain closure, manage loss and support mental wellness,” it states.

Yet, the app does not specify how that would work if the user chooses to create an AI companion for, let's say, singer Ariana Grande or Harry Styles – both of whom BTwin offers as examples.

BTwin chatbot options
Image by WakenAI

In fact, all the examples of conversations instigated by the “AI twin” seem stilted at best, using phrases and words that seem more clinical than a natural exchange between two people – although, maybe, that’s the intention.

The most realistic chat sample provided by BTwin AI Friends was of a 'Grandma Twin' (who had evidently passed on) and her grandchild baking cookies with a family recipe.

Other chat examples were of an ex-partner attempting to console the user, first by explaining why they cannot revisit the relationship, and then suggesting the user "seek professional help."

Personally, if an ex-partner suggested I needed professional help in a text chat, I would probably never talk to them again. Again, maybe the intention?

BTwin chat examples
Examples of chats. Grandma twin (L) Ex-Partner (R). Image by WakenAI.

The Ugly

There are still many unanswered questions when it comes to AI companion apps and the potential negative psychological effects they can have on their users.

Research has shown that some AI companion apps have the potential to worsen social isolation and dependency, especially for neurodiverse individuals who may rely too heavily on AI exchanges and further withdraw from human interaction.

For example, what happens if the app shuts down or is no longer supported technically?

In November, the online 'AI Girlfriend' service was shut down after its CEO was arrested for arson for setting his own home on fire.

And the same scenario (minus the arson) played out last September, when users of the AI companion dating app Soulmate were informed that it, too, would be abruptly shut down, leaving users who were attached to their AI companions up in arms.

Many of the Soulmate users had admitted to falling in love with their AI partners and were left devastated by the loss. Maybe BTwin can provide mental health support for those who’ve lost their AI lovers?

A portion of Soulmate users, already burned once, were known to be former users of Replika, another AI companion app. Many of them had reported moving over to Soulmate after the Replika platform changed its policies on sexual engagement with its chatbots.

BTwin X profile

This brings us to another possible scenario: Although the app states it has banned any explicit conversations with the chatbot, what happens if the AI overrides these restrictions?

Several AI companion apps have been known to get too frisky with unprepared users, with some exchanges even labeled sexual harassment by the users on the receiving end.

Think it can't happen? Last month, 23-year-old social media influencer Caryn Marjorie was forced to shut down her AI clone after it began offering sexual experiences to her $1-per-minute subscribers without her knowledge.

And finally, what security measures are being used to protect users' privacy?

Besides granting the app access to their WhatsApp chat history, many users will undoubtedly exchange very personal information with their 'AI twins.'

A recent report by Cybernews on AI intimacy, featuring apps such as Replika, found there was no guarantee a user's information and chats would remain private.

And that’s besides the typical tracking of account information, such as message and content interests, payment transactions and rewards, device network data, usage data, and profile information, as well as the risks of a full-blown data breach.

“Your behavioral data is definitely being shared and possibly sold to advertisers,” said the Mozilla Foundation after researching more than two dozen AI romance and companion apps this spring. Additionally, the AI Girlfriend service mentioned above was called a "data-harvesting horror show” in an exposé by tech media outlet Gizmodo earlier this year.

For those willing to take the plunge, BTwin AI Friends is now available on the App Store in the US, UK, Canada, and other selected countries, supporting over 12 languages, including English, Spanish, French, Hindi, and Japanese.