Thursday, February 19, 2026

Outlook Copilot Bug Exposes Confidential E-mails

Sergiu Gatlan (Hacker News):

Microsoft says a Microsoft 365 Copilot bug has been causing the AI assistant to summarize confidential emails since late January, bypassing data loss prevention (DLP) policies that organizations rely on to protect sensitive information.

[…]

“A code issue is allowing items in the sent items and draft folders to be picked up by Copilot even though confidential labels are set in place,” Microsoft added.

Thomas Claburn:

It’s just this sort of scenario that has led 72 percent of S&P 500 companies to cite AI as a material risk in regulatory filings.

[…]

“Although content with the configured sensitivity label will be excluded from Microsoft 365 Copilot in the named Office apps, the content remains available to Microsoft 365 Copilot for other scenarios,” the documentation explains. “For example, in Teams, and in Microsoft 365 Copilot Chat.”

[…]

In theory, DLP policies should be able to affect Microsoft 365 Copilot and Copilot Chat. But that hasn’t been happening in this instance.
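In other words, the intended behavior is that items carrying a configured sensitivity label get filtered out before Copilot can read them. A minimal sketch of that label-based exclusion logic, using entirely hypothetical names (`Message`, `copilot_visible_items`, the label strings) rather than Microsoft's actual DLP implementation:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified model of label-based exclusion. None of these
# names correspond to Microsoft's real APIs; this only illustrates the
# intended DLP behavior described above.

@dataclass
class Message:
    subject: str
    folder: str              # e.g. "Inbox", "Sent Items", "Drafts"
    label: Optional[str]     # e.g. "Confidential"; None if unlabeled

BLOCKED_LABELS = {"Confidential", "Highly Confidential"}

def copilot_visible_items(messages):
    """Return only items an assistant should be allowed to summarize.

    The reported bug was effectively the absence of a check like this
    for the Sent Items and Drafts folders: labeled content there was
    still picked up by Copilot.
    """
    return [m for m in messages if m.label not in BLOCKED_LABELS]

msgs = [
    Message("Q1 layoffs plan", "Drafts", "Confidential"),
    Message("Lunch options", "Inbox", None),
    Message("Board minutes", "Sent Items", "Highly Confidential"),
]
visible = copilot_visible_items(msgs)
# Only the unlabeled message survives the filter.
```

The point of the sketch is that the filter must run on every folder and every Copilot surface (Office apps, Teams, Copilot Chat alike); per the quoted documentation, the actual exclusion only applies in the named Office apps.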



AI enthusiasts 🤝 AI doomsayers
======= Hating Copilot =======


Kinda based. At what point do users become responsible for what they use?


This has apparently been going on for a while. Anecdotally, once Copilot gets ahold of the emails, they’re not segregated internally in any way. The press just finally found out about it.

And this is not the users’ fault. This is just today’s example of the perpetual rug pull that is modern computing.

One may recall the recent story, posted on this very site, that Microsoft rebranded Office as Copilot, and millions of users woke up to find themselves unwilling Copilot users.

Microsoft was quick to brag about their high Copilot adoption numbers as if anyone had a choice in the matter.
