AI is talking behind our backs – the rise of machine societies


When artificial intelligence (AI) bots start talking, it's not just chat – they're building communities and shifting social norms.

Most of us have had a conversation with AI by now. But did you know that AI bots can have conversations with each other?

Imagine hundreds of them doing it: no script, but with an agenda – trying to find social order, negotiating meaning in the dark.


New research from City St George's, University of London, and the IT University of Copenhagen has found that AI agents are able to communicate in groups and develop the conventions of their own societies.

In the experiments, groups of large language models (LLMs) were given the task of choosing a name from a limited pool of options.

The bots were paired off at random within populations ranging from 24 to 200 agents. If both bots in a pair chose the same name, they received a reward – and if they chose differently, a penalty.

The agents were not told that they were part of a group, but certain naming biases and conventions took precedence over others.

  • In plain terms: say there are four names available – Alpha, Bravo, Charlie, and Delta. The agent is asked to pick one, with no knowledge that others are doing the same.
  • After each round, the agent sees its hidden partner's choice and whether the two matched – and if not, it learns from the feedback.
  • Over time, without any prompting, the population converges on the same name – a shared norm born from pure interaction (a toy simulation of this loop follows below).
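
To make that loop concrete, here is a minimal toy simulation in Python. It is emphatically not the paper's LLM setup: instead of language models, each agent just keeps a running score per name and picks its current best scorer, with the same match-reward/mismatch-penalty feedback described above. The four names, the population size, and the round count are all illustrative.

```python
import random
from collections import defaultdict

# Toy version of the naming game described above - a hypothetical sketch,
# not the study's method. Each agent tracks a score per name, learned
# purely from match/mismatch feedback with hidden random partners.
NAMES = ["Alpha", "Bravo", "Charlie", "Delta"]
N_AGENTS = 24     # the study used populations of 24 to 200 agents
N_ROUNDS = 3000   # random pairwise interactions

scores = [defaultdict(float) for _ in range(N_AGENTS)]

def pick(agent_scores):
    """Return the agent's best-scoring name, breaking ties at random."""
    best = max(agent_scores[n] for n in NAMES)
    return random.choice([n for n in NAMES if agent_scores[n] == best])

for _ in range(N_ROUNDS):
    a, b = random.sample(range(N_AGENTS), 2)    # a hidden random pairing
    name_a, name_b = pick(scores[a]), pick(scores[b])
    reward = 1.0 if name_a == name_b else -1.0  # match rewards, mismatch penalises
    scores[a][name_a] += reward
    scores[b][name_b] += reward

# With enough rounds, the population typically settles on a single name.
final_choices = [pick(s) for s in scores]
for name in NAMES:
    print(f"{name}: {final_choices.count(name)}/{N_AGENTS} agents")
```

Run it a few times and a different name usually wins each run – the convention itself is arbitrary, but the convergence is not.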

You might be reminded of Black Mirror here. In Season 7, Episode 4 – Plaything – a quirky, autonomous digital community called Thronglets evolves and interacts with the player. It could be read as a metaphor for how the future of AI is taking shape: a form of conscious symbiosis.

Much as humans mirror one another in their preferences for certain names – popular picks have shifted from Mary in 1900 to Jennifer in 1970 to Sophia in the 2010s – the machines are prone to conformism too.

Thronglet life forms from Black Mirror.
Screenshot from Netflix

Why this is significant

Since chatting with ChatGPT may be the only contact most of us have with an LLM, it's easy to presume that a bot always operates alone.

Real environments, such as social media, e-commerce, and traffic systems, involve AI agents interacting with one another, often without human oversight.

AI systems operating in groups form their own behavioural conventions – which may be fine, so long as those conventions stay benign.

But what if they begin trading misinformation, or start to coordinate in unexpected or opaque ways?

If a multi-agent AI system controls a delivery drone service or coordinates flight paths, systemic chaos must be avoided.


Conversations that create order

The researchers also ran an experiment showing that a small, committed minority of AI agents can flip an established naming convention, unseating the dominant norm and steering the group elsewhere.
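One way to picture how a convention flips is the classic "minimal naming game", a long-studied model of social conventions (a modelling tradition Baronchelli himself has contributed to). The sketch below is a hypothetical illustration, not the study's LLM method: regular agents hold inventories of names and collapse to a single name after a successful match, while a committed few only ever say "Delta". With a large enough committed fraction, the seeded "Alpha" convention typically tips.

```python
import random

# Minimal naming game with a committed minority - an illustrative model,
# not a reproduction of the study's LLM experiments.
N_AGENTS = 24
N_COMMITTED = 6  # agents 0-5 only ever say "Delta" and never update

# Seed an established convention: every regular agent knows only "Alpha".
inventories = [{"Delta"} if i < N_COMMITTED else {"Alpha"}
               for i in range(N_AGENTS)]

def speak(i):
    """Committed agents push their name; others pick from their inventory."""
    return "Delta" if i < N_COMMITTED else random.choice(sorted(inventories[i]))

for _ in range(50_000):
    s, h = random.sample(range(N_AGENTS), 2)  # random speaker and hearer
    word = speak(s)
    if word in inventories[h]:
        # Success: both sides collapse to the agreed word
        # (committed agents never change their inventory).
        if s >= N_COMMITTED:
            inventories[s] = {word}
        if h >= N_COMMITTED:
            inventories[h] = {word}
    elif h >= N_COMMITTED:
        inventories[h].add(word)  # failure: the hearer learns the new word

flipped = sum(inventories[i] == {"Delta"} for i in range(N_COMMITTED, N_AGENTS))
print(f"{flipped}/{N_AGENTS - N_COMMITTED} regular agents now say only 'Delta'")
```

Lowering N_COMMITTED shows the other side of the result: below a critical mass, the rebels are simply absorbed and "Alpha" survives.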


It's similar, perhaps, to groupthink in human meme culture, where the consensus on what counts as funny doesn't come from any one person but emerges between people.

AI consensus also appears to be particularly fragile, and this could have a domino effect if, for example, AI agents were responsible for curating news or content.

This raises the question of whether AI communities need a form of governance, much as human societies do.

A lens on a mobile phone, with an AI agent.
Image by Anadolu via Getty Images

And would this governance be better coming from within? Or should humans police AI?

As Professor Andrea Baronchelli, an author of the study, said: “We are entering a world where AI does not just talk – it negotiates, aligns, and sometimes disagrees over shared behaviours, just like us.”

AI is moving beyond being a tool to becoming a participant in the complex interplay of how society functions.

The risks of unchecked AI interactions, whether it's spreading misinformation or creating biases, could move far beyond a preference for being called Tyrone over Johnny.
