Cloud Data Breaches Are Running Rampant. What Are the Common Characteristics?
Protecting against data breaches requires detailed analysis of recent attacks for remediation and prevention.
Data continues to drive enterprise business needs and processes, spurring the need for a rapidly growing number of data stores in which to house it all. Data's inherent value and impact on business operations have made it a prime target for cybercriminals. This has led to a sharp rise in data breaches that have disrupted business continuity and compromised the security and compliance posture of enterprises globally. In these breaches, attackers are commonly exploiting blind spots and misconfigurations.
Protecting against data breaches requires close analysis of recent attacks to derive remediation steps and preventive measures. Issues common to recent cloud data breaches include:
1. Data Leakage Through Compliance Boundaries
A hard lesson from the past decade is that there's no going back once data leaves the organizational control plane, often ending up in public repositories. For this reason, security owners are responsible for keeping tabs on any possibility of exposure, assessing the fallout when exposure occurs, and acting accordingly.
Data leakage is more common than one would think and is not limited to enterprises with smaller security budgets. For example, sensitive Nestlé data was briefly exposed to the world from its internal network before resurfacing a few months later in an alleged cyberattack by Anonymous.
2. Publicly Exposed Buckets
Data is an asset and should be protected as such, whether it be a backup, audit record, or arbitrary file. Access should be based on the principle of least privilege, and any deviation should be identified, monitored, and alerted upon.
This was not the case for some organizations using public cloud data stores such as Amazon S3 buckets and Microsoft Azure Blobs. Four airports in Colombia and Peru, for example, had an S3 bucket — containing 3TB of data on employees — that was publicly accessible and did not require authentication.
Meanwhile, Turkish airline Pegasus had an S3 bucket without password protection, exposing roughly 6.5TB of data that included personally identifiable crew information, flight charts, and secrets from the airline's EFB (Electronic Flight Bag) system.
Doctors Me, an online Japanese medical consultation site, experienced something similar. The service stored patient images, uploaded by customers, in an S3 bucket without proper authentication and authorization controls, exposing the sensitive data of nearly 12,000 people.
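Catching this class of misconfiguration is straightforward to automate. Below is a minimal sketch, assuming boto3 (the AWS SDK for Python) and already-configured credentials, that flags buckets in the current account whose policy status or Public Access Block settings warrant review. It is a starting point, not a complete public-exposure audit.

```python
# Minimal sketch: flag S3 buckets that may be publicly accessible.
# Assumes AWS credentials are configured; covers only the current account.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    # Is the bucket-level Public Access Block fully enabled?
    try:
        pab = s3.get_public_access_block(Bucket=name)[
            "PublicAccessBlockConfiguration"]
        fully_blocked = all(pab.values())
    except ClientError:
        fully_blocked = False  # no Public Access Block configured: treat as suspect
    # Does the bucket policy itself render the bucket public?
    try:
        policy_public = s3.get_bucket_policy_status(Bucket=name)[
            "PolicyStatus"]["IsPublic"]
    except ClientError:
        policy_public = False  # no bucket policy attached
    if policy_public or not fully_blocked:
        print(f"REVIEW {name}: policy_public={policy_public}, "
              f"public_access_block={fully_blocked}")
```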
3. Database Misconfigurations
In their most basic function, databases are protected internal components that hold various records or documents. They are typically encapsulated behind an application or a service that facilitates access to data according to predetermined business logic. As such, databases have the potential to enforce strict access controls and to limit network access, ensuring that only service accounts or authorized personnel can directly reach them.
Databases exposed to the Internet, especially with weak authentication, are low-hanging fruit for bad actors who are constantly scanning for exposed services. This was the case for a popular Iranian chat application called Raychat, where a simple bot detected a misconfigured instance of the service's MongoDB database. All the attackers had to do was clone the database's content, wipe the database, and leave behind a note asking for a hefty ransom.
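To illustrate how trivially such exposure is found, here is a minimal sketch, assuming pymongo and a placeholder hostname, that tests whether a MongoDB instance accepts unauthenticated reads, roughly what an attacker's scanning bot does. Run it only against infrastructure you own.

```python
# Minimal sketch: does a MongoDB instance allow unauthenticated enumeration?
# "db.example.internal" is a placeholder; test only systems you own.
from pymongo import MongoClient
from pymongo.errors import OperationFailure, ServerSelectionTimeoutError

HOST = "db.example.internal"  # hypothetical target

try:
    client = MongoClient(HOST, 27017, serverSelectionTimeoutMS=3000)
    names = client.list_database_names()  # fails on a server enforcing auth
    print(f"EXPOSED: unauthenticated client can enumerate databases: {names}")
except OperationFailure:
    print("OK: server rejected the unauthenticated request (auth enforced)")
except ServerSelectionTimeoutError:
    print("OK: server unreachable from this network (not exposed here)")
```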
4. Missing Encryption
Encryption is one of the top three industry best practices for data protection, but it is not a complete security suite on its own, as data protection is built in layers. When one layer is breached, it's up to the next to hold the line. We see this reflected in risk assessments and audit submissions, where each component, especially a missing fundamental layer like encryption, changes the risk score of many other components that make up the environment. With common issues like false reporting and missing encryption, how can security owners reliably qualify and test the entirety of their known and unknown environment based on assumptions and semi-reliable information?
The Department of Education in New York City struggled with this question when an online grading and attendance system was breached. Roughly 820,000 biographic records of current and former public-school students were stored without proper encryption, allowing the adversary to easily compromise and extract those records.
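Verifying this layer can also be automated. The sketch below, again assuming boto3 and configured credentials, reports each bucket's default server-side encryption configuration. Note that AWS has applied SSE-S3 to new S3 objects by default since early 2023, so a check like this is most useful for confirming KMS requirements on legacy or regulated buckets.

```python
# Minimal sketch: report default server-side encryption per S3 bucket.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        rules = s3.get_bucket_encryption(Bucket=name)[
            "ServerSideEncryptionConfiguration"]["Rules"]
        algorithms = [r["ApplyServerSideEncryptionByDefault"]["SSEAlgorithm"]
                      for r in rules]
        print(f"{name}: default encryption {algorithms}")
    except ClientError:
        # No default-encryption configuration at all: flag for review.
        print(f"REVIEW {name}: no default encryption configured")
```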
How Can Organizations Avoid These Mistakes?
1. Take ownership of your data:
Identify sensitive and "need to be protected" assets with periodic scans of your environment to discover any unknowns and surprises, such as new data stores and network policies (see the inventory sketch below).
Classify and evaluate any solution within your stack that you own or consume.
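As a starting point for such scans, the sketch below, assuming boto3 and a single AWS account and region (both illustrative), inventories a few common data-store types. Diffing successive runs surfaces new, unknown stores.

```python
# Minimal sketch: periodic inventory of data stores in one account/region.
# Service coverage is illustrative; extend it to the services you run.
import boto3

REGION = "us-east-1"  # assumed region

s3 = boto3.client("s3")
rds = boto3.client("rds", region_name=REGION)
dynamodb = boto3.client("dynamodb", region_name=REGION)

inventory = {
    "s3_buckets": [b["Name"] for b in s3.list_buckets()["Buckets"]],
    "rds_instances": [db["DBInstanceIdentifier"]
                      for db in rds.describe_db_instances()["DBInstances"]],
    "dynamodb_tables": dynamodb.list_tables()["TableNames"],
}

for kind, names in inventory.items():
    print(f"{kind}: {len(names)} found")
    for n in names:
        print(f"  - {n}")
# Diff this output against the previous run to surface unknowns.
```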
2. Improve your data security posture:
Assess data stores against industry best practices and compliance standards.
Incorporate data security into your organizational DNA by creating dedicated owners, allocating resources for long-term projects, and establishing incident handling procedures.
3. Control your data:
Scope and periodically reapprove the target audience for each data store, ensuring that only those users can access it via both network policies and RBAC.
Implement a "second set of eyes" methodology when it comes to data replication, export, creation, or deletion.
4. Observe and triage anomalous activities:
Monitor activities against sensitive data and configuration changes around classified data stores.
Create an alerting and triage mechanism for abnormal business scenarios (see the sketch below).
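As one illustration of such a mechanism, the sketch below, assuming boto3 and CloudTrail enabled in the account, pulls the last 24 hours of management events that commonly precede data exposure. The event list and region are illustrative, and a real deployment would route findings to an alerting pipeline rather than stdout.

```python
# Minimal sketch: triage recent configuration changes around data stores
# by querying CloudTrail for sensitive S3 management events.
from datetime import datetime, timedelta, timezone
import boto3

cloudtrail = boto3.client("cloudtrail", region_name="us-east-1")  # assumed region

# Events that commonly precede exposure; illustrative, not exhaustive.
SENSITIVE_EVENTS = ["PutBucketPolicy", "PutBucketAcl", "DeleteBucketEncryption"]

end = datetime.now(timezone.utc)
start = end - timedelta(hours=24)

paginator = cloudtrail.get_paginator("lookup_events")
for event_name in SENSITIVE_EVENTS:
    pages = paginator.paginate(
        LookupAttributes=[{"AttributeKey": "EventName",
                           "AttributeValue": event_name}],
        StartTime=start,
        EndTime=end,
    )
    for page in pages:
        for event in page["Events"]:
            # Route these to an alerting/triage queue in practice.
            print(f"{event['EventTime']} {event_name} by "
                  f"{event.get('Username', 'unknown')}")
```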