The way to your heart is through… ChatGPT

More people are using ChatGPT as a wingman, and some are suffering heartbreak at the hands of deepfakes, according to McAfee.

Nearly one in four Americans have used ChatGPT to help create pictures and other content for a dating app, with the majority of those saying they got more interest and better responses as a result.

A study from the cybersecurity software firm McAfee also found that 64% of people would distrust their interactions with a love interest if they discovered the person had used AI-generated imagery on their profile.

While half of Americans said they would use AI to write their Valentine’s Day card or other love messages, almost one in six said they would be hurt or offended to receive one.

Yet the use of AI to craft a compelling message or fix bad lighting in a photo is relatively harmless. The increasingly accessible and sophisticated technology can be exploited in much more "nefarious" ways, according to Steve Grobman, chief technology officer at McAfee.

“It has never been more challenging to protect yourself – and your heart – from potential scammers online. Cybercriminals are evolving their tactics at the speed of AI, as we develop our defense mechanisms,” Grobman said in an email.

According to the study, one-third of Americans said they’d had an online love interest who turned out to be a scammer, and 42% of people said they’d come across fake profiles or photos that looked AI-generated in the past year.

A similar number also reported encountering deepfake content, and almost one in 10 said they’d fallen victim to a deepfake scam. About half of those victims lost money as a result; of them, roughly half parted with more than $1,000, and 11% lost more than $10,000.

“With 58% of Americans using, or having used, dating websites, apps, or social media to find love, the rise of easily accessible and powerful AI tools has added complexity to the dating scene,” Grobman said.

In the past month, Valentine-themed malicious file, URL, and spam campaigns have also risen by 25%, 300%, and 400%, respectively.

“Sincere users of these platforms are seeking personal and intimate relationships, not to be blindsided and exploited by scammers,” Grobman said. “Further, the format of interactions on dating apps makes it easy for scammers to stay anonymous.”

When it comes to online dating, romance-seekers should be guided by a “healthy dose of skepticism,” he said, offering the following advice to protect privacy, identity, and personal information:

  • Scrutinize any texts, emails, or direct messages you receive from strangers, and watch for tell-tale signs of an AI-written message — for example, generic phrasing that lacks substance or personal detail.
  • Do a reverse-image search of any profile pictures your love interest uses. If they’re associated with another name, or with details that don’t match up, it’s likely a scam.
  • Never send money or gifts to someone you haven’t met in person, even if they send you money first. Scammers often send money to soften up their victim and build trust. Likewise, don’t share personal or account info, even if the other person is forthcoming with theirs.
  • Talk to someone you trust about this new love interest. It can be easy to miss things that don’t add up when you’re emotionally invested and hopeful. So, pay attention to your friends or family when they show signs of concern, and take the relationship slowly.
  • Limit who can view and share your posts on social media. By setting accounts to private and being mindful of who you add as friends or followers, you reduce the likelihood of your images being misused.
  • Consider using AI-driven scam protection tools to block dangerous links that appear in text messages, social media, or web browsers.