
Cloud
10/2/2019 10:00 AM
Commentary by Ameesh Divatia

Controlling Data Leakage in Cloud Test-Dev Environments

The focus on digital transformation and compressed development release cycles is appealing, but it can leave security behind. How should security practitioners address this challenge?

Enterprises are increasingly using the cloud for test and development spin-ups to make development instances quickly available to their application development teams. This approach lets organizations adopt faster release cycles, use DevOps, and enjoy the nimbleness and flexibility of the cloud. But it also introduces increased security and privacy risk when developers migrate sensitive data, in the clear, into the cloud.

A basic cloud migration flow to support this development model involves a source database of production data and a non-production, or development, target database that developers can build and test their applications against. Tools such as the database migration services offered by Amazon Web Services (AWS) and Azure help move large data sets into the cloud. Inevitably, cloning production data can place a replica of sensitive information, in the clear, in a cloud database, which may violate security and compliance policies and may not receive the same level of security a production environment typically gets.
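One way to keep cleartext out of the development copy is to pseudonymize sensitive columns as part of the migration itself. Below is a minimal sketch in Python, using in-memory SQLite as a stand-in for the source and target databases; the table and column names (customers, email, ssn) are hypothetical.

    import hashlib
    import hmac
    import os
    import sqlite3

    # Secret pepper held outside the dev environment; in practice this would
    # come from a secrets manager, not be generated inline.
    PEPPER = os.urandom(32)

    def pseudonymize(value: str) -> str:
        # Replace a sensitive value with a stable, irreversible token.
        return hmac.new(PEPPER, value.encode(), hashlib.sha256).hexdigest()[:16]

    def copy_with_masking(src: sqlite3.Connection, dst: sqlite3.Connection) -> None:
        dst.execute("CREATE TABLE customers (id INTEGER, email TEXT, ssn TEXT)")
        for row_id, email, ssn in src.execute("SELECT id, email, ssn FROM customers"):
            dst.execute("INSERT INTO customers VALUES (?, ?, ?)",
                        (row_id, pseudonymize(email), pseudonymize(ssn)))
        dst.commit()

    # Demo: in-memory databases standing in for production and dev.
    prod = sqlite3.connect(":memory:")
    prod.execute("CREATE TABLE customers (id INTEGER, email TEXT, ssn TEXT)")
    prod.execute("INSERT INTO customers VALUES (1, 'jane@example.com', '123-45-6789')")

    dev = sqlite3.connect(":memory:")
    copy_with_masking(prod, dev)
    print(dev.execute("SELECT * FROM customers").fetchone())
    # (1, '<hex token>', '<hex token>') -- no cleartext PII lands in dev

Because the tokens are deterministic for a given pepper, joins and uniqueness constraints still behave sensibly in the dev copy, while the original values never leave production.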

The downstream effect is that an unprotected non-production deployment may be left running, where an attacker or Internet scanner can uncover cleartext records. Over the past few years, there have been several instances of open or non-production environments exposing their data.

The Security Challenges of This New Reality
In a cloud migration, developers as well as database administrators can access all of your data, in the clear, even if you are using an at-rest encryption solution or tablespace/transparent data encryption to protect that data.

Furthermore, several recent data exposures have been linked to unattended or misconfigured cloud deployments. Last year's Starwood/Marriott breach involved about 339 million records — and may draw a $124 million fine under the General Data Protection Regulation (GDPR).

First American Financial Corporation leaked 885 million records starting as far back as 2003.

And more recently, there is the example of Capital One's AWS deployment and a misconfigured web application firewall. In that scenario, an unauthorized user was able to access sensitive data records, putting 106 million customers at risk.

Transparent data encryption (TDE) was unable to protect these companies or their users because it was never meant to do so. It was designed to protect against what we call the "Tom Cruise threat model," in which someone breaks into a data center and drops down from the ceiling (as in Mission: Impossible) to steal the disks that hold your data. The reality is that today's hackers aren't physically breaking into data centers. They get in with compromised credentials and then move laterally through your environment. Encrypting the physical disks or using at-rest database encryption does nothing to protect the data from these modern attacks.
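The difference is easy to demonstrate: with disk- or tablespace-level encryption, any connection holding valid credentials reads cleartext, while with application-layer encryption the database only ever holds ciphertext. A minimal sketch using Python's cryptography package, illustrating the principle rather than any particular vendor's product:

    import sqlite3
    from cryptography.fernet import Fernet  # pip install cryptography

    # The key lives with the application, never with the database.
    key = Fernet.generate_key()
    f = Fernet(key)

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE users (id INTEGER, ssn BLOB)")

    # The application encrypts before the value ever reaches the database.
    db.execute("INSERT INTO users VALUES (1, ?)", (f.encrypt(b"123-45-6789"),))

    # An attacker with stolen database credentials sees only ciphertext...
    stolen = db.execute("SELECT ssn FROM users").fetchone()[0]
    print(stolen)             # b'gAAAAA...' -- useless without the key

    # ...while the application, holding the key, can still use the data.
    print(f.decrypt(stolen))  # b'123-45-6789'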

The "shared responsibility model" requires users to secure everything in the cloud, while the cloud provider ensures the security of the cloud. One cannot blame a cloud provider for open or misconfigured buckets. That has been and will always be the cloud user's responsibility.

New Attacks Require New Methods of Defense
It is clear that traditional approaches (such as encrypting data at rest and in motion) are no longer enough to protect against new methods of attack, particularly as developers spin up and migrate to cloud test and dev environments.

For far too long, security practitioners have used these technologies as a "check the box" method to achieve compliance. Modern attacks, however, require us to rethink our processes to defend what is most important: the data itself, not the systems or perimeter defenses surrounding it.

I'm encouraged in particular by MongoDB's announcement in June 2019 that it will begin to implement "field-level encryption," which enables users to have "encrypted fields on the server — stored in-memory, in system logs, at-rest and in backups — which are rendered as ciphertext, making them unreadable to any party who does not have client access or the keys necessary to decrypt the data." While only limited operations can be performed on that encrypted data, it is certainly a step in the right direction. More security companies should recognize that the traditional approach to encryption is inadequate to defend what's most important.
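For a sense of what this looks like in code, pymongo exposes an explicit client-side field-level encryption API. A rough sketch, assuming a local mongod, the pymongo[encryption] extras installed, and a throwaway local master key (a production deployment would use a cloud KMS):

    import os
    from bson.codec_options import CodecOptions
    from pymongo import MongoClient
    from pymongo.encryption import Algorithm, ClientEncryption

    # Throwaway 96-byte local master key -- for illustration only.
    kms_providers = {"local": {"key": os.urandom(96)}}

    client = MongoClient()
    client_encryption = ClientEncryption(
        kms_providers, "encryption.__keyVault", client, CodecOptions())
    data_key_id = client_encryption.create_data_key("local")

    # Encrypt on the client; the server stores and logs only ciphertext.
    ciphertext = client_encryption.encrypt(
        "123-45-6789",
        Algorithm.AEAD_AES_256_CBC_HMAC_SHA_512_Deterministic,
        key_id=data_key_id)
    client.test.people.insert_one({"name": "Jane", "ssn": ciphertext})

    # Reading the raw document back shows only an opaque binary value.
    print(client.test.people.find_one()["ssn"])
    print(client_encryption.decrypt(ciphertext))  # '123-45-6789'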

Proactive, Not Reactive, Measures
To prevent a significant number of data breaches, and the regulatory fines that follow them, why not nip the issue in the bud? Regulators would do well to expand encryption mandates from data "at rest" and "in motion" only to also cover data "in memory" and "in use" across the entire environment. Doing so would head off some data breaches (including the cases described above), along with the negative headlines, legal and reputational fallout, and regulatory fines that accompany them. Building encryption into the data migration process would protect sensitive data at all times, automatically, preventing inadvertent exposure.

The consequences of inaction are growing significantly, as regulators have clearly caught on to the importance of data privacy. Just look at the $5 billion Federal Trade Commission fine against Facebook for its failure to protect data from abusive third parties (such as Cambridge Analytica). Five billion dollars is about 10% of the company's 2018 revenue, and 20% of its 2018 profits.

GDPR, which went into effect in mid-2018, sparked a wave of new data privacy regulations in the US. The most significant of these is the California Consumer Privacy Act, which provides unprecedented power for consumers to control the collection, use, and transfer of their data. Up to 40 other states are also in various stages of implementing data privacy regulations.

Putting it All Together
As I wrote in a previous Dark Reading column, security should not be a bottleneck and slow down business functions. When done correctly, security can actually empower a business and create a sustainable competitive advantage.


Ameesh Divatia is Co-Founder & CEO of Baffle, Inc., which provides encryption as a service. He has a proven track record of turning technologies that are difficult to build into successful businesses, selling three companies for more than $425 million combined in the service ...
 
