Microsoft Exposed 38TB of Private AI Data, Including Passwords and Secret Keys

Welcome to Your Daily Dose of AI and Fintech Insights!

On September 18, 2023, cloud security firm Wiz reported that Microsoft had accidentally exposed 38 terabytes of private AI data, including passwords and secret keys, for about three years. The data was exposed through an overly permissive shared access signature (SAS) token for an Azure Storage account, which Microsoft's AI research team had shared in a public GitHub repository while publishing open-source AI training material.
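
The root problem was token scope: a single SAS token granted broad, long-lived access to an entire storage account rather than to the specific files being shared. By contrast, here is a minimal sketch, using the azure-storage-blob Python SDK, of issuing a SAS token limited to one blob, read-only, and short-lived; the account, container, and blob names are hypothetical, and this illustrates least-privilege token issuance in general, not Microsoft's actual setup.

    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import generate_blob_sas, BlobSasPermissions

    # Hypothetical names for illustration only.
    ACCOUNT_NAME = "examplestorageacct"
    ACCOUNT_KEY = "<storage-account-key>"
    CONTAINER = "research-data"
    BLOB = "models/checkpoint.bin"

    # Issue a SAS token scoped to a single blob, read-only, and expiring in one
    # hour, rather than a long-lived token with full control over the account.
    sas_token = generate_blob_sas(
        account_name=ACCOUNT_NAME,
        container_name=CONTAINER,
        blob_name=BLOB,
        account_key=ACCOUNT_KEY,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    )

    # The resulting URL grants access to this one blob only, until it expires.
    url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas_token}"
    print(url)

Azure also lets administrators disable account-key access on a storage account entirely, which forces user delegation SAS tokens backed by Microsoft Entra ID and makes them centrally revocable.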

The exposed data included:

  • A disk backup of two employee workstations

  • Passwords to Microsoft services

  • Secret keys for Azure Storage accounts

  • Private keys for SSH, GPG, and Docker

  • Over 30,000 internal Microsoft Teams messages from 359 employees

Microsoft has since revoked the offending SAS token and secured the data. The company has stated that no customer data was exposed and no other internal services were put at risk, and that it has expanded its secret scanning to flag overly permissive SAS tokens.

The exposure of this sensitive data is a major security lapse for Microsoft. It is also a reminder of the risks associated with cloud storage: businesses that use cloud services need to take deliberate precautions to protect their data.

This incident raises questions about Microsoft's security practices. For example, why were full workstation backups containing passwords and private keys sitting in a storage account that was shared through a public repository, and how did an overly permissive access token go unnoticed for roughly three years before outside researchers found it?

Microsoft has stated that it is committed to protecting its customers' data, but this incident suggests the company's own practices need tightening. Businesses that rely on Microsoft's cloud services should weigh the company's security posture carefully as part of their vendor due diligence.

Potential impact of the data breach

The exposure could have serious consequences for Microsoft and its customers. The leaked passwords and private keys could be used to gain unauthorized access to Microsoft systems, and the internal Teams messages could fuel phishing, identity theft, or other forms of fraud.

Steps that businesses can take to protect themselves

Businesses that use cloud services can take a number of steps to protect themselves from data breaches, including:

  • Carefully review the security policies and procedures of the cloud providers they use.

  • Use strong passwords and multi-factor authentication for all cloud accounts.

  • Regularly back up their data and store the backups in a secure location.

  • Encrypt their data before storing it in the cloud (a brief sketch follows this list).

  • Monitor their cloud accounts for suspicious activity.

By taking these steps, businesses can reduce the risk of data breaches and protect their data from unauthorized access.
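
On the encryption point, here is a minimal sketch of client-side encryption in Python with the cryptography library's Fernet primitive. The key handling is deliberately simplified and hypothetical: in practice the key would be generated and kept in a key vault or HSM, never alongside the data it protects.

    from cryptography.fernet import Fernet

    # Hypothetical key handling for illustration only: a real deployment would
    # generate and store this key in a key vault or HSM.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    plaintext = b"workstation backup contents (illustrative placeholder)"

    # Encrypt locally so that only ciphertext ever leaves the machine.
    ciphertext = fernet.encrypt(plaintext)

    # Even if the storage container holding the upload is later exposed, the
    # blob is unreadable without the key.
    assert fernet.decrypt(ciphertext) == plaintext

The same principle applies whatever the provider: if only ciphertext is uploaded, an exposed bucket or container leaks far less.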

Fresh AI Tools: Find the best and newest AI tools on the market here


Now, AI products from emerging markets:

  • Amini - The single source of truth for African environmental data.

  • iiDentifii - Your Trusted Partner in Remote Biometric Identity Authentication.

Our AI toolbox coming soon!


AI & Fintech opportunities

Visit our site for more.
Thanks for reading, please share!!