Microsoft confirms Copilot bug let its AI read sensitive and confidential emails
A fix has been deployed
Microsoft confirmed that a Copilot security bug was allowing the AI assistant to read and summarize emails that were labeled as confidential. According to a report from Bleeping Computer, the bug bypassed Microsoft's data loss prevention policies, which are meant to protect sensitive information.
The bug, tracked as CW1226324, was discovered in late January and specifically affects Copilot Chat and the "work tab" feature. It let Copilot read and summarize emails in the Sent Items and Drafts folders, including messages that were explicitly labeled as confidential and should have had restricted access.
Copilot Chat is Microsoft's answer to Google Gemini and ChatGPT. It's meant to be context-aware and can interact with Microsoft 365 apps like Word, Excel, PowerPoint and Outlook. The company began rolling it out to Microsoft 365 business customers in September 2025.
"Users' email messages with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat," Microsoft confirmed.
The company said an unspecified code error was responsible for the issue. A fix began rolling out in early February, and Microsoft says it is continuing to monitor the deployment. A final timeline for the rollout has not been revealed, nor has Microsoft said how many organizations or individuals were affected.
That said, the issue has been tagged as "advisory," which usually means that the incident was limited in scope or impact.
"We identified and addressed an issue where Microsoft 365 Copilot Chat could return content from emails labeled confidential authored by a user and stored within their Draft and Sent Items in Outlook desktop. This did not provide anyone access to information they weren’t already authorized to see," a spokesperson told Bleeping Computer.

Scott Younker is the West Coast Reporter at Tom’s Guide. He covers all the latest tech news. He’s been involved in tech since 2011 at various outlets and is on an ongoing hunt to build the easiest-to-use home media system. When not writing about the latest devices, you are more than welcome to discuss board games or disc golf with him. He also handles all the Connections coverage on Tom's Guide and has been playing the addictive NYT game since it was released.
