
Cloud
Commentary
Ameesh Divatia
10/2/2019 10:00 AM

Controlling Data Leakage in Cloud Test-Dev Environments

The focus on digital transformation and compressed development release cycles is appealing, but it means security can be left behind. How should security practitioners address this challenge?

Enterprises are increasingly using the cloud for test and development spin-ups, making development instances quickly available to their application development teams. This process empowers organizations to adopt a faster release cycle, use DevOps, and enjoy the nimbleness and flexibility of the cloud. But it also introduces increased security and privacy risks when developers migrate sensitive data, in the clear, into the cloud.

A basic cloud migration flow to support this development model involves a source database of production data and a non-production or development target database that developers can build and test their applications against. Tools such as the database migration services offered by Amazon Web Services (AWS) and Azure can help move these large data sets. But cloning production data often replicates sensitive information, in the clear, in a cloud database — which may violate security and compliance policies, and which rarely receives the same level of security as a production environment.
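One practical safeguard is to de-identify sensitive columns during the copy itself, so cleartext PII never reaches the non-production database. Below is a minimal sketch in Python, using in-memory SQLite in place of the real source and target databases and a salted hash as a stand-in pseudonymization step; the table, column names, and sample rows are hypothetical, not part of any vendor's migration tooling:

```python
import hashlib
import sqlite3

def pseudonymize(value: str, salt: str = "dev-env-salt") -> str:
    """Replace a sensitive value with a deterministic, irreversible token."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

# Hypothetical production source and non-production target (in-memory here).
prod = sqlite3.connect(":memory:")
prod.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")
prod.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, "Alice Smith", "alice@example.com"),
                  (2, "Bob Jones", "bob@example.com")])

dev = sqlite3.connect(":memory:")
dev.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")

# De-identify sensitive columns during the copy, so cleartext PII
# never lands in the test/dev database.
for cid, name, email in prod.execute("SELECT id, name, email FROM customers"):
    dev.execute("INSERT INTO customers VALUES (?, ?, ?)",
                (cid, pseudonymize(name), pseudonymize(email)))
dev.commit()

rows = list(dev.execute("SELECT name, email FROM customers"))
```

Because the tokens are deterministic, joins and equality tests still work in the dev environment, but a leaked dev database exposes no usable PII.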

The downstream effect is that a non-production deployment without protection may be left running, and as a result, an attacker or Internet scanner may uncover cleartext records. Over the past few years, there have been several instances of open or non-production environments having their data exposed.

The Security Challenges of This New Reality
In a cloud migration, developers as well as database administrators can access all of your data, in the clear, even if you are using an at-rest encryption solution or tablespace/transparent data encryption to protect that data.

Furthermore, several recent data exposures have been linked to unattended or misconfigured cloud deployments. Last year's Starwood/Marriott breach involved about 339 million records — and may draw a $124 million fine under the General Data Protection Regulation (GDPR).

First American Financial Corporation leaked 885 million records starting as far back as 2003.

And more recently, there is the example of Capital One's AWS deployment and a misconfigured web application firewall. In that scenario, an unauthorized user was able to access sensitive data records, putting 106 million customers at risk.

Transparent data encryption (TDE) was unable to protect these companies or their users because it was never meant to do so. It was designed to protect against what we call the "Tom Cruise threat model" — where someone breaks into a data center and drops down from the ceiling (as in Mission: Impossible) to steal the disks that hold your data. The reality is that today's attackers aren't physically breaking into data centers. They gain entry with compromised credentials and then move laterally through your environment. Encrypting the physical disks or using database encryption at rest does nothing to protect the data from these modern-day attacks.

Under the "shared responsibility model," users are responsible for security in the cloud, while the cloud provider is responsible for security of the cloud. One cannot blame a cloud provider for open or misconfigured buckets; that has been and will always be the cloud user's responsibility.

New Attacks Require New Methods of Defense
It is clear that traditional approaches, such as encrypting data at rest and in motion, are no longer enough to protect against new attack methods, particularly as developers spin up and migrate to cloud test and dev environments.

For far too long, security practitioners have used these technologies as a "check the box" method to achieve compliance. Modern attacks, however, require us to rethink our processes to defend what is most important: the data itself, not the systems or perimeter defenses surrounding it.

I'm encouraged in particular by MongoDB's announcement in June 2019 that it will begin to implement "field-level encryption," which enables users to have "encrypted fields on the server — stored in-memory, in system logs, at-rest and in backups — which are rendered as ciphertext, making them unreadable to any party who does not have client access or the keys necessary to decrypt the data." While it allows only limited operations on the encrypted data, it certainly is a step in the right direction. More security companies should recognize that the traditional approach to encryption is inadequate to defend what's most important.
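The pattern MongoDB describes can be illustrated generically: the client encrypts a sensitive field before it ever reaches the server, so the server-side store only ever holds ciphertext. The sketch below is not MongoDB's actual API; the SHA-256-based keystream is a toy stand-in for a real authenticated cipher such as AES-GCM, and the document layout is hypothetical:

```python
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy SHA-256 counter-mode keystream; a real implementation
    # would use an authenticated cipher such as AES-GCM.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_field(key: bytes, plaintext: str) -> bytes:
    """Encrypt one field client-side; returns nonce || ciphertext."""
    nonce = secrets.token_bytes(16)
    data = plaintext.encode()
    ct = bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))
    return nonce + ct

def decrypt_field(key: bytes, blob: bytes) -> str:
    """Reverse encrypt_field; only a holder of the client key can do this."""
    nonce, ct = blob[:16], blob[16:]
    pt = bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))
    return pt.decode()

# The key never leaves the client; the "server" document holds only
# ciphertext for the sensitive field, so server logs, memory dumps,
# and backups expose nothing readable.
client_key = secrets.token_bytes(32)
server_doc = {"user_id": 42, "ssn": encrypt_field(client_key, "078-05-1120")}
```

The design point is that a compromise of the server — or of a developer's read access to it — yields ciphertext only; the decryption key lives with the client application.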

Proactive, Not Reactive, Measures
To prevent a significant number of data breaches, and the significant regulatory fines they trigger, why not nip this issue in the bud? Regulators would do well to expand encryption mandates from "at rest" and "in motion" only to also cover data "in memory" and "in use" across the entire environment. Doing so would head off some data breaches (including the cases described above), along with the negative headlines, legal and reputational damage, and regulatory fines that follow. Building encryption into the data migration process would protect sensitive data at all times, automatically, preventing inadvertent exposure.

The consequences of inaction are growing significantly, as regulators have clearly caught on to the importance of data privacy. Just look at the $5 billion Federal Trade Commission fine against Facebook for its failure to protect data from abusive third parties (such as Cambridge Analytica). Five billion dollars is about 10% of the company's 2018 revenue, and 20% of its 2018 profits.

GDPR, which went into effect in mid-2018, sparked a wave of new data privacy regulations in the US. The most significant of these is the California Consumer Privacy Act, which provides unprecedented power for consumers to control the collection, use, and transfer of their data. Up to 40 other states are also in various stages of implementing data privacy regulations.

Putting it All Together
As I wrote in a previous Dark Reading column, security should not be a bottleneck and slow down business functions. When done correctly, security can actually empower a business and create a sustainable competitive advantage.

Related Content:

Check out The Edge, Dark Reading's new section for features, threat data, and in-depth perspectives. Today's top story: "The Inestimable Values of an Attacker's Mindset & Alex Trebek."

Ameesh Divatia is Co-Founder & CEO of Baffle, Inc., which provides encryption as a service. He has a proven track record of turning technologies that are difficult to build into successful businesses, selling three companies for more than $425 million combined in the service ...