
Cloud
10/2/2019 10:00 AM
Ameesh Divatia
Commentary

Controlling Data Leakage in Cloud Test-Dev Environments

The focus on digital transformation and compressed development release cycles is appealing, but it can leave security behind. How should security practitioners address this challenge?

Enterprises are increasingly using the cloud for test and development spin-ups, making development instances quickly available to their application development teams. This process empowers organizations to adopt a faster release cycle, use DevOps, and enjoy the nimbleness and flexibility of the cloud. But it also introduces increased security and privacy risk when developers migrate sensitive data, in the clear, into the cloud.

A basic cloud migration flow to support this development model involves a source database of production data and a non-production or development target database that developers can build and test their applications against. Database migration services can help move these large data sets to Amazon Web Services (AWS) or Azure. But using clones of production data creates a replica of sensitive information, in the clear, in a cloud database. That replica may violate security and compliance policies, and it typically does not receive the same level of security as a production environment.
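One way to reduce that risk is to transform sensitive columns as part of the clone itself. Below is a minimal sketch, in Python, of deterministic pseudonymization applied to a production extract before it is loaded into a dev environment. The column names, file paths, and key handling are assumptions for illustration, not a complete solution.

```python
import csv
import hmac
import hashlib

# Hypothetical column names and key for this sketch only.
SENSITIVE_COLUMNS = {"email", "ssn", "phone"}
PSEUDONYM_KEY = b"fetch-this-from-a-secrets-manager"  # never hard-code in practice

def pseudonymize(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token.

    HMAC-SHA256 keeps the mapping consistent across rows (so joins and
    uniqueness constraints still behave), but the original value cannot
    be recovered without the key.
    """
    return hmac.new(PSEUDONYM_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def mask_extract(src_path: str, dst_path: str) -> None:
    """Copy a CSV extract, tokenizing sensitive columns along the way."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col in SENSITIVE_COLUMNS & set(row):
                row[col] = pseudonymize(row[col])
            writer.writerow(row)

if __name__ == "__main__":
    mask_extract("customers_prod.csv", "customers_dev.csv")
```

Because the tokens are deterministic, developers can still test joins, lookups, and deduplication logic against the dev copy without ever holding real customer data.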

The downstream effect is that an unprotected non-production deployment may be left running, where an attacker or Internet scanner can uncover cleartext records. Over the past few years, there have been several instances of open or non-production environments exposing their data.

The Security Challenges of This New Reality
In a cloud migration, developers as well as database administrators can access all of your data, in the clear, even if you are using an at-rest encryption solution or tablespace/transparent data encryption to protect that data.

Furthermore, several recent data exposures have been linked to unattended or misconfigured cloud deployments. Last year's Starwood/Marriott breach involved about 339 million records — and may draw a $124 million fine under the General Data Protection Regulation (GDPR).

First American Financial Corporation leaked 885 million records starting as far back as 2003.

And more recently, there is the example of Capital One's AWS deployment and a misconfigured web application firewall. In that scenario, an unauthorized user was able to access sensitive data records, putting 106 million customers at risk.

Transparent data encryption (TDE) was unable to protect these companies or their users because it was never meant to do so. It was designed to protect against what we call the "Tom Cruise threat model" — where someone breaks into a data center and drops down from the ceiling (as in Mission Impossible) to steal disks that hold your data. The reality is that hackers aren't physically breaking into data centers in today's world. They break in with compromised credentials and then move laterally in your environment. Encrypting the physical disks or using at-rest database encryption does nothing to protect the data from these modern-day attacks.

The "shared responsibility model" requires users to secure everything in the cloud, while the cloud provider ensures the security of the cloud. One cannot blame a cloud provider for open or misconfigured buckets. That has been and will always be the cloud user's responsibility.

New Attacks Require New Methods of Defense
It is clear that traditional approaches (such as encrypting data at rest and in motion) are no longer enough to protect against new methods of attack, particularly as developers spin up and migrate to cloud test and dev environments.

For far too long, security practitioners have used these technologies as a "check the box" method to achieve compliance. Modern attacks, however, require us to rethink our processes to defend what is most important: the data itself, not the systems or perimeter defenses surrounding it.

I'm encouraged in particular by MongoDB's announcement in June 2019 that it will begin to implement "field-level encryption," which enables users to have "encrypted fields on the server — stored in-memory, in system logs, at-rest and in backups — which are rendered as ciphertext, making them unreadable to any party who does not have client access or the keys necessary to decrypt the data." While it allows only limited operations on the encrypted data, it is certainly a step in the right direction. More security companies should recognize that the traditional approach to encryption is inadequate to defend what's most important.
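To make the idea concrete, here is a conceptual sketch of client-side field-level encryption using Python's cryptography library. It illustrates the principle only — it is not MongoDB's actual client API, and the record shape and key handling are assumptions.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The application encrypts sensitive fields before they ever reach the
# database, so the server, its logs, memory, and backups hold only
# ciphertext for those fields.
key = Fernet.generate_key()  # in practice, fetch from a key management service
fernet = Fernet(key)

record = {"name": "Alice", "ssn": "123-45-6789"}

# Encrypt only the sensitive field; the rest stays queryable in the clear.
stored = {
    "name": record["name"],
    "ssn": fernet.encrypt(record["ssn"].encode()),
}

# A DBA or attacker reading the database sees only ciphertext for "ssn".
print(stored["ssn"])

# Only a client holding the key can recover the original value.
assert fernet.decrypt(stored["ssn"]).decode() == "123-45-6789"
```

The trade-off is exactly the one noted above: operations on encrypted fields are limited (equality lookups may work with deterministic schemes, but range queries and server-side analytics generally do not).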

Proactive, Not Reactive, Measures
To prevent a significant number of data breaches, and the regulatory fines they trigger, why not nip the issue in the bud? Regulators would do well to expand encryption mandates beyond data "at rest" and "in motion" to also cover data "in memory" and "in use" across the entire environment. Doing so would head off some data breaches (including cases like those described above), along with the negative headlines, legal and reputational fallout, and regulatory fines that accompany them. Building encryption into the data migration process would protect sensitive data at all times, automatically, preventing inadvertent exposure.
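As an illustration of what that could look like, here is a minimal sketch of an encryption stage wired into a migration pipeline, so that sensitive fields are protected before records ever reach the cloud target. The field names and the extract/load hooks are assumptions for the example, not any particular vendor's tooling.

```python
import os
from typing import Iterable, Iterator

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical sketch: encryption as a mandatory transform stage in a
# migration pipeline. Records are encrypted in flight, before loading.
key = AESGCM.generate_key(bit_length=256)  # fetch from a KMS in practice
aead = AESGCM(key)

def encrypt_stage(records: Iterable[dict], fields: set) -> Iterator[dict]:
    """Encrypt the named fields of every record flowing through the pipeline."""
    for record in records:
        out = dict(record)
        for field in fields & set(record):
            nonce = os.urandom(12)  # AES-GCM requires a unique nonce per message
            ciphertext = aead.encrypt(nonce, str(record[field]).encode(), None)
            out[field] = nonce + ciphertext  # store nonce alongside ciphertext
        yield out

# extract() and load() stand in for whatever the migration tooling provides:
# for record in encrypt_stage(extract("prod_db"), {"email", "ssn"}):
#     load("dev_target_db", record)
```

Because the transform runs inside the pipeline, a developer cannot forget to protect a clone: every record that reaches the non-production target is already ciphertext in its sensitive fields.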

The consequences of inaction are growing significantly, as regulators have clearly caught on to the importance of data privacy. Just look at the $5 billion Federal Trade Commission fine against Facebook for its failure to protect data from abusive third parties (such as Cambridge Analytica). Five billion dollars is about 10% of the company's 2018 revenue, and 20% of its 2018 profits.

GDPR, which went into effect in mid-2018, sparked a wave of new data privacy regulations in the US. The most significant of these is the California Consumer Privacy Act, which provides unprecedented power for consumers to control the collection, use, and transfer of their data. Up to 40 other states are also in various stages of implementing data privacy regulations.

Putting it All Together
As I wrote in a previous Dark Reading column, security should not be a bottleneck and slow down business functions. When done correctly, security can actually empower a business and create a sustainable competitive advantage.


Ameesh Divatia is Co-Founder & CEO of Baffle, Inc., which provides encryption as a service. He has a proven track record of turning technologies that are difficult to build into successful businesses, selling three companies for more than $425 million combined in the service ...