
Microsoft AI Team Accidentally Leaks Private Data (image: www.pixabay.com)
Microsoft's AI research team accidentally leaked 38 terabytes of private data, including passwords to Microsoft services, secret keys, and more than 30,000 internal Microsoft Teams messages from more than 350 Microsoft employees.
The leaked data included full backups of two employees' computers. As per the Wiz report, the researchers shared their files using an Azure feature called SAS tokens, which allows you to share data from Azure Storage accounts. Access can be limited to specific files, but in this case the link was configured to grant access to the entire storage account, and as a result 38 TB of private files were exposed.
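To illustrate why that scoping matters, here is a minimal sketch in plain Python that inspects the query parameters of a SAS URL and flags tokens that grant account-wide write access rather than read-only access to a single blob. The URLs are hypothetical examples, not the actual leaked link; the parameter names (`sp` for permissions, `se` for expiry, `sr`/`srt` for resource scope) follow Azure's SAS query-string conventions.

```python
from urllib.parse import urlparse, parse_qs

def describe_sas(url: str) -> dict:
    """Summarize the scope of an Azure SAS URL from its query string.

    Key fields (Azure SAS query-parameter naming):
      sp  - permissions granted (r=read, w=write, d=delete, l=list, ...)
      se  - expiry time
      sr  - signed resource for a service SAS (b=blob, c=container)
      srt - signed resource types for an *account* SAS (s/c/o)
    """
    params = parse_qs(urlparse(url).query)
    get = lambda key: params.get(key, [""])[0]
    info = {
        "permissions": get("sp"),
        "expiry": get("se"),
        "resource": get("sr"),
        "resource_types": get("srt"),
    }
    # An account SAS (srt present) carrying write/delete/create rights is
    # far broader than a read-only SAS scoped to a single blob.
    info["overly_broad"] = bool(info["resource_types"]) and any(
        p in info["permissions"] for p in ("w", "d", "c")
    )
    return info

# Hypothetical URLs for illustration only:
narrow = ("https://example.blob.core.windows.net/models/model.pt"
          "?sp=r&se=2021-01-01T00:00:00Z&sr=b&sig=...")
broad = ("https://example.blob.core.windows.net/"
         "?sp=rwdl&se=2051-01-01T00:00:00Z&srt=sco&sig=...")

print(describe_sas(narrow)["overly_broad"])  # False: read-only, single blob
print(describe_sas(broad)["overly_broad"])   # True: account-wide, writable
```

A token like the second one is dangerous twice over: it covers every container and object in the account, and its far-future expiry means it stays valid until someone notices and revokes it.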
In an interview with CRN, CrowdStrike CEO George Kurtz sharply criticized Microsoft over its security shortcomings. He said the security trade-offs that come with adopting the Microsoft security stack are not worth it.
Wiz called the software giant's failure 'an example of the new risks organizations face when starting to leverage the power of AI more broadly, as more of their engineers now work with massive amounts of training data'. Wiz is calling for additional security checks and safeguards around these massive datasets as scientists and engineers race to bring new AI solutions to production.
In the course of its research, Wiz identified a GitHub repository under the Microsoft organization named robust-models-transfer. The repository belongs to Microsoft's AI research division, and its purpose is to provide open-source code and AI models for image recognition. The URL it provided, however, allowed access to far more than just the open-source models.
Adding to the potential fallout, according to Wiz, the data appears to have been exposed since 2020.
Wiz contacted Microsoft earlier this year, on June 22, to warn it about the discovery. Two days later, Microsoft invalidated the SAS token, closing off the exposure. Microsoft carried out and completed an investigation into the potential impact in August.
Microsoft's Security Response Center told TechCrunch that "no customer data was exposed, and no other internal services were put at risk because of this issue."