Hundreds of Russian US election disinformation posts remain on social media despite alerts
Two days after US authorities uncovered a Russian-linked social media influence campaign aimed at swaying the 2024 US presidential elections, hundreds of posts remained on TikTok, Facebook, X, and Rumble – even after the platforms were notified.

On Wednesday, the US government accused two employees of the Russian state media network RT of coordinating the disinformation campaign, which included 400 posts by Tenet Media, the online content company at the heart of the case.

On Friday, those posts were still accessible on TikTok, unlabeled and untouched.


Of all the major platforms where Tenet distributed its videos, only Alphabet's YouTube has so far penalized the company, pulling down the main Tenet Media channel along with four others operated by owner Lauren Chen on Thursday.

The only other change detected by Reuters to those accounts involved an advertisement Tenet had placed on Instagram, which started running in August and was still active as of Wednesday, but was disabled by Thursday.

None of the other social media companies responded to Reuters requests for comment on how they planned to handle the posts or whether Tenet Media was in violation of their platforms' rules.

Meta, the parent company of Facebook and Instagram, also would not clarify whether it or Tenet had removed the Instagram ad. Tenet Media likewise did not respond to a Reuters request, nor did Chen or Liam Donovan, the two people named in its incorporation records.

The platforms' apparent inaction on the campaign is a striking departure from the aggressive efforts they have touted in recent years to expose secretive foreign propaganda campaigns, reflecting both the novelty of the tactics allegedly used and the fraught politics of policing content posted by real people inside the United States.

It also exposes a fresh challenge faced by the platforms as Russia increasingly turns to unwitting American social media stars to covertly influence voters ahead of US elections this year, a sort of digital update to Cold War-era practices of laundering messages through journalists or front media outlets, according to disinformation researchers.

"What we're ultimately grappling with is a problem that exists in the real world. It's manifesting on social media in the sense that the entity has a presence there, but it isn't a social media problem per se," said Olga Belogolova, a disinformation professor at Johns Hopkins School of Advanced International Studies and former head of influence operations policy at Meta.

The US Justice Department said on Wednesday that two RT employees worked with foreign nationals in the United States to set up a company in Tennessee that paid prominent conservative commentators to post regular videos on topics designed to amplify political divisions in the United States.


That company paid $8.7 million to the production companies of three of the online stars it recruited and its founders received more than $760,000, according to the indictment. The commentators did not know the funding came from RT, the Justice Department said.

Though the indictment did not name the company, details provided in court filings match up with Tenet Media, a Nashville-based company.

The offline nature of the alleged relationships between RT, Tenet Media, and the US commentators makes the case unusual in the world of online influence operations, which social media companies began cracking down on after US intelligence concluded that Russia had used Facebook as part of a campaign to help former President Donald Trump win the White House in 2016.

Moscow denies involvement

Moscow has denied that claim, as it also denied the US allegations on Wednesday. RT responded to the charges with ridicule.

Most major online platforms now label state-affiliated media organizations, while Meta, TikTok and YouTube owner Google all produce either monthly or quarterly reports to document their ongoing removal of coordinated networks of fake accounts.

The companies also have rules requiring users to disclose sponsorships by applying "branded content" and "paid partnership" labels to relevant posts, tools generally used by influencers paid to promote clothes, makeup and other products to their thousands of followers.

Meta defines branded content as "a creator or publisher's content that features or is influenced by a business partner for an exchange of value, such as monetary payment or free gifts," according to its documentation explaining the rules.

Taking action against Tenet-related content, however, entails dealing with accounts that are neither fake, nor directly state-run, nor doing traditional product placements, while also wading into the thorny politics of moderating the speech of real US conservative personalities.

Politicians on the right have accused social media platforms of censoring their speech. Meta CEO Mark Zuckerberg has been extending an olive branch their way, most recently in a letter to Congress last month in which he expressed regret about some of his company’s moderation decisions.


Belogolova, the former Meta staffer, said social media companies would be wise to deliberate carefully before applying their rules in ways that could create dangerous precedents for legitimate speech.

"I can guarantee you, having been on the other side of something like this, that there are conversations happening right now about the policy levers that exist and what would be appropriate and inappropriate to use in this particular situation, and trying not to make snap decisions," she said.