The focus on digital transformation and compressed development release cycles is appealing, but the pace can leave security behind. How should security practitioners address this challenge?

Ameesh Divatia, Co-Founder & CEO of Baffle

October 2, 2019

5 Min Read

Enterprises are increasingly using the cloud for test and development spin-ups so that development instances are quickly available to their application development teams. This approach lets organizations adopt faster release cycles, embrace DevOps, and enjoy the nimbleness and flexibility of the cloud. But it also introduces increased security and privacy risks when developers migrate sensitive data, in the clear, into the cloud.

A basic cloud migration flow to support this development model involves a source database of production data and a non-production or development target database that developers build and test their applications against. Database migration services from providers such as Amazon Web Services (AWS) and Azure make it easy to move large data sets into the cloud. But cloning production data this way can leave a replica of sensitive information sitting in a cloud database in the clear, which may violate security and compliance policies and may not receive the same level of security a production environment typically does.
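To make the exposure concrete, here is a minimal sketch of copying a production table into a dev/test copy while pseudonymizing sensitive columns before they ever leave the source. It uses in-memory SQLite databases as stand-ins for the production and cloud development instances, and the table, column names, and hashing approach are illustrative assumptions rather than any vendor's migration tooling; a real deployment would use keyed tokenization or format-preserving encryption instead of a bare hash.

    import hashlib
    import sqlite3

    # Stand-ins for the production source and the cloud dev/test target.
    prod = sqlite3.connect(":memory:")
    dev = sqlite3.connect(":memory:")

    prod.execute("CREATE TABLE customers (id INTEGER, name TEXT, ssn TEXT, email TEXT)")
    prod.executemany(
        "INSERT INTO customers VALUES (?, ?, ?, ?)",
        [(1, "Jane Doe", "123-45-6789", "jane@example.com"),
         (2, "John Roe", "987-65-4321", "john@example.com")],
    )

    dev.execute("CREATE TABLE customers (id INTEGER, name TEXT, ssn TEXT, email TEXT)")

    def pseudonymize(value: str) -> str:
        """Replace a sensitive value with a stable, non-reversible token."""
        return hashlib.sha256(value.encode()).hexdigest()[:16]

    # Copy row by row, masking sensitive columns so cleartext never lands
    # in the non-production environment.
    for row_id, name, ssn, email in prod.execute("SELECT id, name, ssn, email FROM customers"):
        dev.execute(
            "INSERT INTO customers VALUES (?, ?, ?, ?)",
            (row_id, pseudonymize(name), pseudonymize(ssn), pseudonymize(email)),
        )
    dev.commit()

    print(dev.execute("SELECT * FROM customers").fetchall())  # tokens, not cleartext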

The downstream effect is that an unprotected non-production deployment may be left running, where an attacker or an Internet scanner can uncover cleartext records. Over the past few years, there have been several instances of open or non-production environments exposing their data.

The Security Challenges of This New Reality
In a cloud migration, developers as well as database administrators can access all of your data, in the clear, even if you are using an at-rest encryption solution or tablespace/transparent data encryption to protect that data.

Furthermore, several recent data exposures have been linked to unattended or misconfigured cloud deployments. Last year's Starwood/Marriott breach involved about 339 million records — and may draw a $124 million fine under the General Data Protection Regulation (GDPR).

First American Financial Corporation exposed 885 million records, some dating as far back as 2003.

More recently, a misconfigured web application firewall in Capital One's AWS deployment allowed an unauthorized user to access sensitive data records, putting 106 million customers at risk.

Transparent data encryption (TDE) was unable to protect these companies or their users because it was never meant to do so. It was designed to protect against what we call the "Tom Cruise threat model," in which someone breaks into a data center and drops down from the ceiling (as in Mission: Impossible) to steal the disks that hold your data. The reality is that hackers aren't physically breaking into data centers in today's world. They get in with compromised credentials and then move laterally through your environment. Encrypting the physical disks or using at-rest database encryption does nothing to protect the data from these modern attacks.

The "shared responsibility model" requires users to secure everything in the cloud, while the cloud provider ensures the security of the cloud. One cannot blame a cloud provider for open or misconfigured buckets. That has been and will always be the cloud user's responsibility.

New Attacks Require New Methods of Defense
It is clear that traditional approaches (such as encrypting data at rest and in motion) are no longer enough to protect against new methods of attack, particularly as developers spin up and migrate to cloud test and dev environments.

For far too long, security practitioners have used these technologies as a "check the box" method to achieve compliance. Modern attacks, however, require us to rethink our processes to defend what is most important: the data itself, not the systems or perimeter defenses surrounding it.

I'm encouraged in particular by MongoDB's announcement in June 2019 that it will begin to implement "field-level encryption," which enables users to have "encrypted fields on the server — stored in-memory, in system logs, at-rest and in backups — which are rendered as ciphertext, making them unreadable to any party who does not have client access or the keys necessary to decrypt the data." While it allows only limited operations on that encrypted data, it is certainly a step in the right direction. More security companies should recognize that the traditional approach to encryption is inadequate to defend what's most important.
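As a rough illustration of what client-side field-level encryption looks like to an application, here is a hedged sketch using PyMongo's explicit encryption helpers (PyMongo 3.9+ with the pymongocrypt package and a MongoDB 4.2+ server). The database, collection, and field names are placeholders, and the local master key is for demonstration only; a production deployment would pull keys from a KMS.

    import os

    from bson.binary import STANDARD
    from bson.codec_options import CodecOptions
    from pymongo import MongoClient
    from pymongo.encryption import Algorithm, ClientEncryption

    # Demonstration-only local master key; real deployments use a KMS.
    kms_providers = {"local": {"key": os.urandom(96)}}

    client = MongoClient()  # assumes a MongoDB 4.2+ server on the default port
    client_encryption = ClientEncryption(
        kms_providers,
        "encryption.__keyVault",  # key vault namespace
        client,
        CodecOptions(uuid_representation=STANDARD),
    )

    # Create a data key and encrypt a single field explicitly.
    data_key_id = client_encryption.create_data_key("local")
    encrypted_ssn = client_encryption.encrypt(
        "123-45-6789",
        Algorithm.AEAD_AES_256_CBC_HMAC_SHA_512_Deterministic,
        key_id=data_key_id,
    )

    # Only ciphertext for this field ever reaches the server.
    client.hr.employees.insert_one({"name": "Jane Doe", "ssn": encrypted_ssn})

    # A client holding the keys can decrypt; everyone else sees ciphertext.
    print(client_encryption.decrypt(encrypted_ssn))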

Proactive, Not Reactive, Measures
To prevent a significant number of data breaches, and the regulatory fines that follow them, why not nip this issue in the bud? Regulators would do well to expand encryption mandates beyond data "at rest" and "in motion" to also cover data "in memory" and "in use" across the entire environment. Doing so would stop some data breaches, including the cases described above, along with the negative headlines, legal and reputational issues, and regulatory fines, before they occur. Building encryption into the data migration process would protect sensitive data at all times, automatically, preventing inadvertent exposure.
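To sketch what building encryption into the migration step itself might look like, the snippet below encrypts a sensitive column with an application-held key before the rows are written to the target, so the cloud copy only ever holds ciphertext. It assumes the cryptography package and placeholder record fields; it illustrates the principle rather than any particular product's implementation.

    from cryptography.fernet import Fernet  # requires the 'cryptography' package

    # Key held by the application or a key-management service,
    # never by the target database.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    source_rows = [
        {"id": 1, "name": "Jane Doe", "ssn": "123-45-6789"},
        {"id": 2, "name": "John Roe", "ssn": "987-65-4321"},
    ]

    # Encrypt the sensitive field during migration, so the dev/test
    # target only ever receives ciphertext.
    migrated_rows = [
        {**row, "ssn": fernet.encrypt(row["ssn"].encode()).decode()}
        for row in source_rows
    ]

    print(migrated_rows)  # the 'ssn' values are opaque tokens in the target copy

    # Only a holder of the key can recover the original value.
    print(fernet.decrypt(migrated_rows[0]["ssn"].encode()).decode())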

The consequences of inaction are growing significantly, as regulators have clearly caught on to the importance of data privacy. Just look at the $5 billion Federal Trade Commission fine against Facebook for its failure to protect data from abusive third parties (such as Cambridge Analytica). Five billion dollars is about 10% of the company's 2018 revenue, and 20% of its 2018 profits.

GDPR, which went into effect in mid-2018, sparked a wave of new data privacy regulations in the US. The most significant of these is the California Consumer Privacy Act, which gives consumers unprecedented power to control the collection, use, and transfer of their data. As many as 40 other states are in various stages of implementing data privacy regulations.

Putting It All Together
As I wrote in a previous Dark Reading column, security should not be a bottleneck that slows down business functions. When done correctly, security can actually empower a business and create a sustainable competitive advantage.


About the Author(s)

Ameesh Divatia

Co-Founder & CEO of Baffle

Ameesh Divatia is Co-Founder & CEO of Baffle, Inc., which provides encryption as a service. He has a proven track record of turning technologies that are difficult to build into successful businesses, selling three companies for more than $425 million combined in the service provider and enterprise data center infrastructure market.

