
Microsoft Copilot Bug Reads Confidential Emails Without Permission

Microsoft has confirmed a significant bug in its Microsoft 365 Copilot Chat feature that allowed the AI to access and summarize confidential emails without proper authorization, even when Data Loss Prevention (DLP) policies and sensitivity labels were in place. The flaw persisted for weeks and highlights growing concerns about privacy and AI-driven data exposure in enterprise environments as fixes continue to roll out.

Microsoft has confirmed a bug in Copilot Chat — the AI assistant built into Microsoft 365 apps like Word, Excel, and Outlook — that caused it to read and summarize confidential emails without authorization. The bug has been active since January 2026.

The bug only affects organizations that use Microsoft-hosted email and related Microsoft 365 services.

The problem: emails labeled as “confidential” (using Microsoft’s sensitivity labels) were supposed to be off-limits to Copilot. Companies use these labels as part of data loss prevention (DLP) policies — essentially rules that say “don’t feed this sensitive content to the AI.” The bug ignored those rules entirely.

Why this matters

DLP policies are supposed to be a firewall between sensitive business data and AI processing. If you’re in a regulated industry (legal, finance, healthcare, HR), you may have confidential emails that are specifically labeled to keep them away from AI tools. This bug punched a hole right through that protection, for weeks.

Microsoft hasn’t said how many customers were affected. That silence is telling.

What Microsoft is doing about it

Microsoft has started rolling out a fix. Admins can track the issue in the Microsoft 365 admin center under message ID CW1226324. If your IT team manages your Microsoft 365 environment, they should check this entry to confirm whether your organization was affected and whether the fix has been applied.

What you should do

  • If you’re an admin: Look up CW1226324 in the Message Center (admin.microsoft.com) right now. Check whether the fix has rolled out to your tenant.
  • If you’re a regular user: Ask your IT department whether your organization uses Copilot Chat and whether confidential email labels are part of your DLP setup. If yes, ask whether you were affected.
  • If you work in a regulated industry: This may have compliance implications. Document when you became aware of the bug and what steps were taken — just in case you need to show due diligence later.
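For admins who prefer to check programmatically rather than through the admin center UI, Message Center posts are exposed through the Microsoft Graph service announcements endpoint (`/admin/serviceAnnouncement/messages`, requiring the `ServiceMessage.Read.All` permission). A minimal sketch, assuming you already have an access token from your own auth flow (token acquisition is out of scope here):

```python
# Hedged sketch: look for a specific Message Center post (e.g. CW1226324)
# among a tenant's service announcements via Microsoft Graph.
# Assumes a valid access token with ServiceMessage.Read.All; the token
# placeholder below is hypothetical and must come from your auth flow.
import json
import urllib.request

GRAPH_URL = "https://graph.microsoft.com/v1.0/admin/serviceAnnouncement/messages"


def find_message(messages, message_id):
    """Return the Message Center post whose id matches message_id, or None."""
    for msg in messages:
        if msg.get("id") == message_id:
            return msg
    return None


def fetch_messages(access_token):
    """Fetch the tenant's Message Center posts from Microsoft Graph."""
    req = urllib.request.Request(
        GRAPH_URL,
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]


if __name__ == "__main__":
    token = "<access token from your auth flow>"  # placeholder, not real
    post = find_message(fetch_messages(token), "CW1226324")
    print("Post found" if post else "Post not visible in this tenant")
```

If the post is returned, its body describes whether your tenant was affected and the status of the fix rollout; absence of the post is not proof you were unaffected.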

The bigger picture

This isn’t an isolated concern. Just this week, the European Parliament’s IT department blocked all built-in AI features on lawmakers’ devices, citing the risk that AI tools could quietly upload confidential content to the cloud. That decision looks increasingly prescient.

The bottom line: “confidential” labels and DLP policies only protect you if the software actually respects them. This bug is a reminder that Microsoft’s AI rollout has been rushed, with a focus on fast delivery over customer privacy. With Copilot, “turned on by default” doesn’t mean “safe by default.”

About this author

Office-Watch.com

Office Watch is the independent source of Microsoft Office news, tips and help since 1996. Don't miss our famous free newsletter.
