Russia, Iran, and China are also participating in the US presidential election. American intelligence officials say the three countries are using AI tools to sway voters ahead of November’s vote.
All three countries are involved in secretive shenanigans, officials from the Office of the Director of National Intelligence (ODNI) and the Federal Bureau of Investigation (FBI) told reporters in a briefing.
However, Moscow is the most aggressive and skilled manipulator, and it is working to hurt Kamala Harris, the vice president and Democratic nominee, more than her opponent, former president Donald Trump.
According to US intelligence officials, Russia is emphasizing stories and comments that demean Harris’s personal qualities or positions. It has also doctored clips of Harris’s speeches to replace some words and used generative AI to create false content.
“For example, the IC (the intelligence community) assesses Russian influence actors were responsible for staging a video in which a woman claims she was the victim of a hit-and-run car accident by the Vice President and altering videos of the Vice President’s speeches,” the summary of the ODNI assessment (PDF) said.
The fabricated hit-and-run video, analyzed by Microsoft researchers, went viral and spread quickly on platforms such as X, whose owner, Elon Musk, seemingly treats nearly any kind of content as free speech.
The Microsoft Threat Analysis Center (MTAC) warned that Russian election interference efforts have shifted to spreading “outlandish fake conspiracy theories” about the Harris-Walz campaign.
The Washington Post said US intelligence officials agree with the assessment. During the briefing, they pointed to a recent indictment alleging that Russian officials invested $10 million in a Tennessee media company that employed right-wing influencers to create videos that promoted Russian interests.
Two employees of the Russian state media network RT were accused of coordinating the disinformation campaign. Soon afterward, Meta, the parent company of Facebook, WhatsApp, and Instagram, banned RT and other Russian state media networks from its platforms.
Thus far, however, intelligence officials have not seen evidence that AI tools are helping foreign actors “revolutionize” influence operations.
“The risk to US elections from foreign AI-generated content depends on the ability of foreign actors to overcome restrictions built into many AI tools and remain undetected, develop their own sophisticated models, and strategically target and disseminate such content,” reads the assessment.