Copilot bug lets ‘AI’ read confidential Outlook emails
Summary created by Good Solutions AI
In summary:
- PCWorld reports on a serious Microsoft Copilot bug (CW1226324) that allows the AI to scan and summarize confidential Outlook emails, bypassing privacy protections.
- This vulnerability affects Microsoft 365 accounts and compromises sensitive information like contracts and medical records stored in the Sent and Drafts folders.
- Microsoft is rolling out a fix, but the timeline remains unclear, raising significant concerns about AI reliability and data privacy protection.
For all its supposed intelligence, “AI” seems to make a lot of silly mistakes, like scanning and summarizing emails marked “confidential” in Microsoft Outlook. That’s the latest issue with Microsoft’s Copilot assistant, according to a bug report from Microsoft itself.
Copilot Chat in Microsoft 365 accounts is able to read and summarize emails in the Sent and Drafts folders of Outlook, even when they’re marked confidential… a label that’s specifically designed to keep automated tools out. BleepingComputer summarizes the issue, labeled “CW1226324,” and says that a fix is being rolled out to affected accounts. There’s no timeline for when the fix will be available to all users. (Unfortunately, the full report isn’t available for viewing by the general public; you need Microsoft 365 admin privileges just to see it.)
The problem is, as you might guess, alarming. The confidential feature in Outlook is often used for things like business contracts, legal correspondence, government or police investigations, and personal medical information. It’s the kind of stuff you absolutely don’t want scanned by a large language model, and definitely not sucked up into its training data, as is so often the case.
Microsoft isn’t saying how many users are affected, but it is saying that “the scope of impact may change” as it investigates the problem. How comforting. That’ll really get people to start using Copilot, right?