AI Shubka
Microsoft Copilot Chat error sees confidential emails exposed to AI tool

by ShubkaAi
February 19, 2026
in AI & Future Tech, AI breakthroughs (GPT updates, generative models), Best AI tools for creators, Robotics & automation, Tech forecasts


Some experts warned that the speed at which companies compete to add new AI features meant these kinds of mistakes were inevitable.

Copilot Chat can be used within Microsoft programs such as Outlook and Teams, which handle email and chat, to answer questions or summarise messages.

“We identified and addressed an issue where Microsoft 365 Copilot Chat could return content from emails labelled confidential authored by a user and stored within their Draft and Sent Items in Outlook desktop,” a Microsoft spokesperson told BBC News.

“While our access controls and data protection policies remained intact, this behaviour did not meet our intended Copilot experience, which is designed to exclude protected content from Copilot access,” they added.

“A configuration update has been deployed worldwide for enterprise customers.”

The blunder was first reported by tech news outlet Bleeping Computer, which said it had seen a service alert confirming the issue.

It cited a Microsoft notice saying “users’ email messages with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat”.

The notice added that a work tab within Copilot Chat had summarised email messages stored in a user’s drafts and sent folders, even when they had a sensitivity label and a data loss prevention policy configured to prevent unauthorised data sharing.

Reports suggest Microsoft first became aware of the error in January.

Its notice about the bug was also shared on a support dashboard for NHS workers in England – where the root cause is attributed to a “code issue”.

A section of the notice on the NHS IT support site implies the health service has been affected.

But the NHS told BBC News the contents of any draft or sent emails processed by Copilot Chat would remain with their creators, and that patient information had not been exposed.




© 2026 aishubka - Smarter Business. & Automated Future. by aishubka.
