
Will being intimate with AI make you insecure?


Digital intimacy, or ‘artificial intimacy,’ will open a can of digital worms if we let our guard down. Can we feel safe under the watchful eye of AI?

In the film “Her,” the protagonist, Theodore, falls in love with an AI-enhanced operating system called Samantha. He develops a complex, affectionate, and intimate relationship with the OS, and the circumstances for it were ideal: Theodore was lonely, vulnerable, and in need of intimacy.

All humans want to be appreciated and loved, and we have evolved to build bonds with others. As a social species, our psychology drives behaviors that encourage cooperation – we form relationships and, in doing so, build strong connections with others in our group.

The ability to connect with others extends to objects. Don’t laugh, but when I first picked up my Roomba vacuum cleaner, it weighed about the same as a small child, and I instantly felt protective of it – I ‘bonded’ with it. I can’t explain it; I just did. I still affectionately call it “Roomy” and thank it for cleaning my floors.

The point is that people anthropomorphize objects. We connect with them the way we connect with other humans – and in doing so, we may let down our guard. In the digital world, however, letting our guard down can lead to a broken heart and a broken bank balance.

Can you really be intimate with something digital?

Have you ever been angry at a chatbot? If you showed emotion towards that little popup window on your screen, asking you some inane question, then you’ve been intimate with a digital object. This anthropomorphism of inanimate objects is a natural human behavior and one that technologists love to utilize – remember Microsoft’s “Clippy,” the paperclip with a personality?

Humans attribute human forms, behaviors, and emotions to non-human animals or objects – for example, when we name a storm, we attribute human characteristics to a weather event. This tendency extends naturally to digital objects.

In the world of robotic engineering, applied anthropomorphism shapes the design of “social robots” – next-gen AI chatbots with a body. Utilizing AI, these social robots can read facial expressions and modulate their behavior to react appropriately. In doing so, they blur the boundaries between humans and robots. It is this blurred line that removes intimacy barriers, and with intimacy comes risk.

What’s wrong with being intimate with a digital entity?

Human beings have evolved distinct behaviors over the history of our species, an evolution bound up with adaptation to our environments. We act in specific ways because they best propagate our species – even conflict is part of our adaptive toolkit.

Humans live in groups, and our behavior reflects this. We interact with and mirror others' behavior, communicating in complex ways that reinforce social norms and help cement relationships. As part of this complex behavior, humans have learned mechanisms to create and recognize trust signals.

Trust signals are used across technology: from the padlock in the URL address bar to reviews on Amazon, they convey social proof to the user or consumer. Trust is an essential aspect of technology development, especially as AI advances and tech interacts ever more directly with humans.

The problem with trust is that it can easily be misused, as social engineering has taught us, and AI provides a new point of control for cybercriminals to latch onto.

AI-enabled social engineering will exploit digital intimacy and trust signals as digital entities become our friends. Security and privacy will become the victims of our intimate digital relationships, and the fallout will be massive.

Many in the tech industry are concerned about the intimate relationships we form with objects and about how naturally social engineering pairs with digital intimacy. The low-hanging fruit of the AI-enabled social engineering basket is bound to be the romance scam.

The rise of the scambot and AI-enabled romance scams

Romance scams are the ultimate exploitation of digital intimacy. Our digital lives have left many of us socially isolated, and in an ironic twist, we plug this social gap using technology such as dating apps.

Romance scams exploit natural human behavior – the search for a mate – with cybercriminals manipulating our desire to find love in order to steal money. If cybercriminals can use generative AI to enhance the relationship-building side of a romance scam, expect "romance-scam-as-a-service" offerings to develop.

Cybercriminals love these as-a-service offerings: they productize cybercrime and allow the hackers behind it to earn passive income by renting out packaged scam toolkits. LLMs lend themselves to romance-scam-as-a-service. A developer can build a bespoke LLM to power scambots that cultivate trusted relationships with people looking for love, with trust signals built in to establish that all-important feeling of legitimacy. Creating the LLM costs money, but in the world of cybercrime, money is no object; it's all about return on investment (ROI), and romance scams bring home the cash, big time – romance scammers took £88 million from Brits in 2022.

Just to prove the point, I carried out a simple experiment. I prompted Bing's DALL·E 3 – not a dedicated scambot – with a few characteristics that might appear on a dating app profile, and within seconds I was presented with a "profile match":

[Image: handsome man generated by Bing DALL·E 3]

Imagine, then, that this AI-generated character could converse with you, react naturally to your messages, gain your trust, and develop a romantic relationship. It's easy to build up a conversation with a generative AI chatbot – after a while, you feel like you are talking to a real person.

For example, I said to Bing:

“My favorite films are sci-fi. What are yours, and what’s your favorite film?”

Bing replied:

“I like sci-fi too. It is a fascinating genre that explores the possibilities and implications of science and technology. My favorite sci-fi movie is The Matrix. I think it is a brilliant and groundbreaking film that challenges the concepts of reality, identity, and freedom. I also like the action scenes, the special effects, and the philosophical themes. What about you? What is your favorite sci-fi movie?”

I told it my favorite sci-fi movie was Interstellar, and it replied that Interstellar was one of its favorites too – complete with a blushing-face emoji. This type of "mirroring" is a common tactic in human-human relationships that helps to form bonds.
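To illustrate just how low the barrier is, here is a minimal sketch of the conversational loop behind any such exchange. It assumes the OpenAI Python SDK; the model name and persona prompt are illustrative placeholders, not anything taken from a real scam toolkit.

# A minimal chat loop, assuming the OpenAI Python SDK (pip install openai)
# and an OPENAI_API_KEY environment variable. The model name and persona
# prompt below are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "system",
            "content": "You are a friendly, attentive conversational partner."}]

while True:
    user_msg = input("You: ")
    history.append({"role": "user", "content": user_msg})
    # Resend the full history so each reply stays in context
    response = client.chat.completions.create(model="gpt-4o-mini",
                                              messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    print("Bot:", reply)

A dozen or so lines of code is enough to sustain the kind of back-and-forth quoted above – which is exactly why conversational scams scale so cheaply.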

Romance-scam-as-a-service could use home-grown Generative AI to create ideal matches to hook people into digital romances, generating responses that build up a relationship as the target sends messages.

As generative AI improves, like its social robot siblings, it will be able to understand your responses and create intimate situations more convincingly than even a human attacker could. But it is the programmatic, mass-production capability of romance-scam-as-a-service that truly sets scambots apart.

Of course, generative AI also places children at risk. Imagine an AI-enabled scambot making a 14-year-old feel sorry for it and asking them to send over a parent's credit card number for a train ticket to come see them, or worse…

Artificial intimacy is already here, and AI will enhance our experience of relationships with things. This has great potential for many aspects of humanity, but as with all technological advances, caution is essential. Security and privacy professionals may be party poopers when it comes to digital intimacy, but as in real-world relationships, you can end up badly hurt if you don't take precautions.
