Microsoft AI will make you speak a foreign tongue on Teams: what could go wrong


A new AI-powered feature from Microsoft will allow people attending video meetings to hear speakers “talk” in a different language. That’s great in theory – in reality, the tech giant is expanding the threat landscape in corporate environments, critics say.

Microsoft unveiled the new AI interpreter on November 19th. On Teams, it can simulate speaker voices and offers near-real-time voice interpretation in nine languages – Chinese (Mandarin), English, French, German, Italian, Japanese, Korean, Portuguese (Brazil), and Spanish.

According to company representatives, the new feature aims to democratize access to interpreters. Human translators for international video calls are expensive.


“Imagine being able to sound just like you in a different language. The Interpreter agent in Teams provides real-time speech-to-speech interpretation during meetings, and you can opt to have it simulate your speaking voice for a more personal and engaging experience,” Microsoft said in a press release.

The feature is currently being tested by a limited group of users and will be more broadly available in 2025 for accounts with a Microsoft 365 Copilot license.

There’s more. Meeting transcriptions will soon support multilingual meetings: transcripts will offer up to 31 translation languages, alongside a version in the language that was originally spoken.


That’s because Microsoft, speaking with The Washington Post, has admitted that the interpreter in Teams – like most AI interpretation tools – may not be 100% accurate.

AI translations are also usually less lexically rich than those produced by human interpreters, researchers say. Machine translators struggle to accurately convey colloquialisms, analogies, and cultural nuances.

However, analysts say that security challenges are the most concerning. Audio deepfakes are already a big problem – impersonation scams cost more than $1 billion last year alone, the US Federal Trade Commission said in April.

Just this year, fraudsters used deepfake technology to arrange a bogus Teams video conference call and elaborately trick a finance worker at a multinational firm into paying out $25 million.


Sure, not much is known about the new Teams interpreter yet, and Microsoft may well have done a good job of securing the tool. Still, researchers say that bad actors will probably try to abuse it anyway.

“Ever be North Korean but want to sound American? It's now possible!” said vx-underground, an anonymous threat analysis group that regularly posts bulletins about threat actors on X.

“Yeah, we rip on Microsoft a lot. But for each feature they add, they're just expanding the threat landscape in corporate environments (and potentially home users). Should we be more optimistic? Maybe. Are we optimistic? Hell nah.”

Of course, job interviews are also conducted on Teams. A report by Secureworks, released in October, warned that members of a stealthy North Korean hacking group were applying for IT jobs at companies across the US, the UK, and Australia. Once hired, the bad actors can steal a firm’s trade secrets.

Cybersecurity pros say companies and organizations worried about the possibility of an impersonation scam should probably opt out of using the interpreter and voice simulator on Teams.