
10/2/2019 10:00 AM
Ameesh Divatia
Commentary

Controlling Data Leakage in Cloud Test-Dev Environments

The focus on digital transformation and compressed development release cycles is appealing, but it can leave security behind. How should security practitioners address this challenge?

Enterprises are increasingly using the cloud for test and development spin-ups to make development instances quickly available to their application development teams. This process lets organizations adopt a faster release cycle, use DevOps, and enjoy the nimbleness and flexibility of the cloud. But it also introduces increased security and privacy risks when developers migrate sensitive data, in the clear, into the cloud.

A basic cloud migration flow to support this development model involves a source database of production data and a non-production or development target database that developers can build and test their applications against. Database migration services can help move large data sets to Amazon Web Services (AWS) or Azure. But cloning production data creates a cleartext replica of sensitive information in a cloud database, one that may violate security and compliance policies or may not receive the same level of security that a production environment typically receives.
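To make that risk concrete, here is a minimal sketch of a migration step that masks sensitive columns before they ever land in the development copy. It uses Python's built-in SQLite module and a hypothetical customers table purely for illustration; a real AWS or Azure migration would use different tooling, but the principle of never letting cleartext data reach the dev target is the same.

```python
import hashlib
import sqlite3

# Hypothetical schema, for illustration only.
prod = sqlite3.connect("production.db")
prod.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, email TEXT, ssn TEXT)")

dev = sqlite3.connect("dev_copy.db")
dev.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, email TEXT, ssn TEXT)")

def mask(value: str) -> str:
    """Replace a sensitive value with a salted, truncated hash so test data
    keeps a realistic shape without exposing the original ("demo-salt" is a
    stand-in; a real pipeline would manage the salt as a secret)."""
    return hashlib.sha256(("demo-salt:" + value).encode()).hexdigest()[:12]

for cust_id, email, ssn in prod.execute("SELECT id, email, ssn FROM customers"):
    # Non-sensitive fields are copied verbatim; PII is masked in transit,
    # so the development database never holds cleartext records.
    dev.execute("INSERT INTO customers VALUES (?, ?, ?)",
                (cust_id, mask(email), mask(ssn)))

dev.commit()
prod.close()
dev.close()
```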

The downstream effect is that an unprotected non-production deployment may be left running, where an attacker or Internet scanner can uncover cleartext records. Over the past few years, there have been several instances of open or non-production environments exposing their data.

The Security Challenges of This New Reality
In a cloud migration, developers as well as database administrators can access all of your data, in the clear, even if you are using an at-rest encryption solution or tablespace/transparent data encryption to protect that data.
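To see why, consider that at-rest encryption is applied and removed below the SQL layer: the engine transparently decrypts pages for any authenticated session. Below is a minimal sketch, assuming a PostgreSQL-style database, the third-party psycopg2 driver, and a hypothetical customers table; any valid (or stolen) credentials see cleartext.

```python
import psycopg2  # third-party driver, used here only as a familiar example

# TDE/disk encryption is invisible at this layer. With valid or compromised
# credentials, an ordinary query returns cleartext records.
conn = psycopg2.connect(host="db.example.com", dbname="dev",
                        user="dev", password="...")  # placeholder credentials
cur = conn.cursor()
cur.execute("SELECT email, ssn FROM customers LIMIT 5")  # hypothetical table
for row in cur.fetchall():
    print(row)  # plaintext, despite encryption at rest
conn.close()
```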

Furthermore, several recent data exposures have been linked to unattended or misconfigured cloud deployments. Last year's Starwood/Marriott breach involved about 339 million records — and may draw a $124 million fine under the General Data Protection Regulation (GDPR).

First American Financial Corporation leaked 885 million records starting as far back as 2003.

And more recently, there is the example of Capital One's AWS deployment and a misconfigured web application firewall. In that scenario, an unauthorized user was able to access sensitive data records, putting 106 million customers at risk.

Transparent data encryption (TDE) was unable to protect these companies or their users because it was never meant to do so. It was designed to protect against what we call the "Tom Cruise threat model": someone breaks into a data center and drops down from the ceiling (as in Mission: Impossible) to steal the disks that hold your data. The reality is that hackers aren't physically breaking into data centers in today's world. They are hacking in with compromised credentials and then moving laterally through your environment. Encrypting the physical disks or using database encryption at rest does nothing to protect the data from these modern-day attacks.

The "shared responsibility model" requires users to secure everything in the cloud, while the cloud provider ensures the security of the cloud. One cannot blame a cloud provider for open or misconfigured buckets. That has been and will always be the cloud user's responsibility.

New Attacks Require New Methods of Defense
It is clear that traditional approaches (such as encrypting data at rest and in motion) are no longer enough to protect against new methods of attack, particularly as developers spin up and migrate to cloud test and dev environments.

For far too long, security practitioners have used these technologies as a "check the box" method to achieve compliance. Modern attacks, however, require us to rethink our processes to defend what is most important: the data itself, not the systems or perimeter defenses surrounding it.
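Defending the data itself means the application encrypts sensitive fields before they are written anywhere, so the database, its logs, and its backups hold only ciphertext. Here is a minimal sketch using the third-party cryptography package; key management is deliberately oversimplified.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In production, the key would live in a KMS or HSM, never next to the data.
key = Fernet.generate_key()
f = Fernet(key)

# Encrypt the field itself before it is stored.
token = f.encrypt(b"123-45-6789")
print(token)  # what the database, logs, and backups actually see

# Only a holder of the key can recover the value.
print(f.decrypt(token).decode())
```

Because the ciphertext travels with the record, a misconfigured bucket or an abandoned dev instance leaks nothing usable without the key.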

I'm encouraged in particular by MongoDB's announcement in June 2019 that it will begin to implement "field-level encryption," which enables users to have "encrypted fields on the server — stored in-memory, in system logs, at-rest and in backups — which are rendered as ciphertext, making them unreadable to any party who does not have client access or the keys necessary to decrypt the data." While it allows only limited operations on that encrypted data, it is certainly a step in the right direction. More companies in the security industry should recognize that the traditional approach to encryption is inadequate to defend what's most important.
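For a feel for what field-level encryption looks like in practice, the sketch below uses the explicit client-side field-level encryption helpers in the pymongo driver (they additionally require the pymongocrypt package and a running MongoDB instance; the key material and namespace names are placeholders).

```python
import os
from bson.binary import STANDARD
from bson.codec_options import CodecOptions
from pymongo import MongoClient
from pymongo.encryption import Algorithm, ClientEncryption

client = MongoClient()  # assumes a local MongoDB instance

# A local 96-byte master key, for demonstration only; production setups
# would use a cloud KMS provider instead.
kms_providers = {"local": {"key": os.urandom(96)}}

client_encryption = ClientEncryption(
    kms_providers,
    "encryption.__keyVault",  # placeholder key-vault namespace
    client,
    CodecOptions(uuid_representation=STANDARD),
)

data_key_id = client_encryption.create_data_key("local")

# The driver encrypts the field before it leaves the client, so the server,
# its logs, and its backups only ever see ciphertext.
encrypted_ssn = client_encryption.encrypt(
    "123-45-6789",
    Algorithm.AEAD_AES_256_CBC_HMAC_SHA_512_Deterministic,
    key_id=data_key_id,
)
client["demo_db"]["people"].insert_one({"name": "demo", "ssn": encrypted_ssn})
```

The deterministic algorithm shown is what makes the "limited operations" possible: the server can still match encrypted fields for equality without ever seeing the plaintext.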

Proactive, Not Reactive, Measures
Why not nip this issue in the bud before breaches trigger significant regulatory fines? Regulators would do well to expand encryption mandates from "at rest" and "in motion" only to also include "in memory" and "in use" across the entire environment. Doing so would head off some data breaches (especially in cases like those described above), along with the negative headlines, legal and reputational issues, and regulatory fines that follow. Building encryption into the data migration process would protect sensitive data at all times, automatically, preventing inadvertent exposure.

The consequences of inaction are growing significantly, as regulators have clearly caught on to the importance of data privacy. Just look at the $5 billion Federal Trade Commission fine against Facebook for its failure to protect data from abusive third parties (such as Cambridge Analytica). Five billion dollars is about 10% of the company's 2018 revenue, and 20% of its 2018 profits.

GDPR, which went into effect in mid-2018, sparked a wave of new data privacy regulations in the US. The most significant of these is the California Consumer Privacy Act, which provides unprecedented power for consumers to control the collection, use, and transfer of their data. Up to 40 other states are also in various stages of implementing data privacy regulations.

Putting it All Together
As I wrote in a previous Dark Reading column, security should not be a bottleneck and slow down business functions. When done correctly, security can actually empower a business and create a sustainable competitive advantage.


Ameesh Divatia is Co-Founder & CEO of Baffle, Inc., which provides encryption as a service. He has a proven track record of turning technologies that are difficult to build into successful businesses, selling three companies for more than $425 million combined in the service ...
 
