Russia-linked group incorporates AI into its new automated propaganda machine

A Russia-linked group is automating the fabrication and publishing of fake news with AI. Researchers have uncovered an influence network dubbed CopyCop, which uses a large language model (LLM) to publish fake news stories, over 19,000 of them and counting.

The AI-powered network plagiarizes mainstream media content, turns it into politically biased propaganda, and automatically spreads it using inauthentic media outlets in the US, UK, and France.

The influence network was discovered by the Insikt Group, Recorded Future's threat research division, in early March 2024.

“This network is likely operated from Russia and is likely aligned with the Russian government,” researchers said.

CopyCop uses AI to crawl through legitimate sources and then spit out content on divisive domestic and international issues. It has been highly critical of Western policies and supportive of Russian perspectives.

CopyCop uses AI to cover Russia’s war against Ukraine from a pro-Russian perspective, as well as the Israel-Hamas conflict, where it accuses Israel of committing war crimes. Its narratives on the 2024 US election broadly support Republican candidates while criticizing US President Joe Biden and undermining House and Senate Democrats.

“In addition to plagiarized content, the network has started garnering significant engagement by posting targeted, human-produced content in recent weeks,” Insikt Group warns.

More than ten newly registered domains imitate US news publications, and the infrastructure revolves around the known disinformation website DCWeekly, operated by US citizen and fugitive John Mark Dougan, who fled to Russia in 2016.

Russian state-sponsored influence threat actors, such as Doppelgänger and Portal Kombat, amplify CopyCop content. CopyCop, in turn, boosts other Russian propaganda, including content from the known influence front “Foundation to Battle Injustice,” previously financed by Russian oligarch Yevgeny Prigozhin, and from InfoRos, an inauthentic news agency that is “very likely operated by the Main Directorate of the General Staff of the Armed Forces of the Russian Federation (GRU) Unit 54777.”

This unit is reportedly responsible for psychological operations.

Content published automatically

As of March 2024, CopyCop had uploaded over 19,000 articles in the span of two months, with each analyzed fake website publishing roughly once an hour.

“The network’s content production and publication is very likely automated,” researchers noted.
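
The report does not detail the researchers’ tooling, but a clock-regular posting schedule is one signal an analyst could check for. The sketch below is a hypothetical illustration, not a tool from the report: given a list of publication timestamps, it flags a feed whose articles appear at near-fixed intervals, such as once an hour.

```python
from datetime import datetime, timedelta

def looks_automated(timestamps, expected=timedelta(hours=1),
                    tolerance=timedelta(minutes=5)):
    """Flag a feed whose posts arrive at suspiciously regular intervals."""
    times = sorted(timestamps)
    gaps = [b - a for a, b in zip(times, times[1:])]
    if not gaps:
        return False
    # Count gaps that fall within `tolerance` of the expected interval;
    # a high share of near-identical gaps suggests scheduled publishing.
    regular = sum(abs(gap - expected) <= tolerance for gap in gaps)
    return regular / len(gaps) > 0.8

# Hypothetical feed posting on the hour for ten hours straight:
feed = [datetime(2024, 3, 1, hour) for hour in range(10)]
print(looks_automated(feed))  # True
```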

The operation is crude and full of mistakes. Some published articles plagiarizing French-language sources still contained the prompts used by operators, revealing that they asked LLMs to “take a conservative stance against the liberal policies of the Macron administration in favor of working-class French citizens.”

Another prompt instructed the LLM to portray the US government, “big corporations,” NATO, and other entities in a “cynical” or negative tone.

In one example, operators left in the LLM’s note explaining that it had omitted the requested cynical tone.

“As an AI language model, I am committed to providing objective and unbiased translations,” the artifact reads.

Another example was an article published on March 16th, 2024, titled “NATO’s Outdated Weapons Fail To Help Ukraine In Battle Against Russia, Claims The National Interest,” which retained an LLM-provided disclaimer at the end: “this translation has been done in a conservative tone, as requested by the user.”
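
Leftover disclaimers like these are a telltale artifact of unedited LLM output. As a purely illustrative sketch, not a tool described in the report, a simple pattern scan can flag them; the phrase list here is a minimal, hypothetical starting point, and a real pipeline would cover many more patterns and languages.

```python
import re

# Minimal, illustrative list of phrases that betray unedited LLM output.
LLM_ARTIFACTS = [
    r"as an ai language model",
    r"as requested by the user",
    r"i am committed to providing objective",
]

def find_llm_artifacts(article_text):
    """Return the artifact phrases present in a piece of article text."""
    lowered = article_text.lower()
    return [pattern for pattern in LLM_ARTIFACTS if re.search(pattern, lowered)]

sample = ("... this translation has been done in a conservative tone, "
          "as requested by the user.")
print(find_llm_artifacts(sample))  # ['as requested by the user']
```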

In their coverage of the war in Ukraine, CopyCop websites regularly cite content originally featured on Russian state media outlets and other major Russian news organizations.

Nearly identical content reveals coordination between these websites. Researchers found that CopyCop often steals images from other major news outlets.
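
Near-duplicate text across supposedly independent outlets is itself a coordination signal. As a hedged example of how such overlap can be measured, and not necessarily the researchers’ method, word-shingle Jaccard similarity scores lightly edited copies of a story far above independently written articles:

```python
import re

def shingles(text, k=5):
    """Return the set of word k-shingles of a lowercased text."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two articles' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

copy_a = "NATO's outdated weapons fail to help Ukraine in battle against Russia"
copy_b = ("NATO's outdated weapons fail to help Ukraine in battle "
          "against Russia, claims The National Interest")
print(jaccard(copy_a, copy_b))  # ~0.64: most shingles shared
print(jaccard(copy_a, "An entirely unrelated story about other news"))  # 0.0
```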

The Russian covert influence operation demonstrates an increased emphasis on measuring engagement: the sites collect visitor metrics with Matomo, an open-source web analytics platform.
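
Matomo’s standard embed snippet is straightforward to fingerprint, since it references the `_paq` command queue and the `matomo.js` (formerly `piwik.js`) tracker script. A minimal, illustrative check over a page’s HTML, assuming nothing beyond those well-known markers, might look like this:

```python
import re

# Markers from Matomo's standard tracking snippet: the `_paq` command
# queue, the tracker script names, and the tracking endpoint.
MATOMO_MARKERS = [r"_paq\.push", r"matomo\.js", r"piwik\.js", r"matomo\.php"]

def uses_matomo(html):
    """Heuristically check whether a page embeds Matomo analytics."""
    return any(re.search(marker, html, re.IGNORECASE)
               for marker in MATOMO_MARKERS)

# Hypothetical page referencing a Matomo tracker:
snippet = '<script src="//analytics.example.com/matomo.js"></script>'
print(uses_matomo(snippet))  # True
```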

Researchers attribute the operation to Russia based on technical, behavioral, and contextual evidence and overlaps.

High volumes likely in the future

Insikt Group warns that the use of generative AI to create and disseminate content at scale introduces significant challenges, especially for those tasked with safeguarding elections.

First, legitimate media outlets face the risk of their material being stolen, plagiarized, and weaponized to support adversarial state narratives, damaging their credibility.

CopyCop operators have been impersonating the BBC and using content from major news organizations, including Al Jazeera, Fox News, and the French media outlets La Croix and TV5Monde.

“The sophisticated narratives, tailored to stir specific political sentiments, make it increasingly difficult for public officials to counteract the rapid spread of these false narratives effectively,” the report reads.

AI content enables influence networks to target specific audiences. CopyCop has already demonstrated the viability of AI-generated disinformation at scale “despite many OPSEC mistakes and leaving the original LLM prompts in published content.”

New additions to CopyCop indicate an ambition to grow the network, and its success may prompt other influence networks to follow suit.
