The canary in the coalmine died 12 years ago, the law went into effect 19 months ago, but many organizations still won't be ready for the new privacy regulations when enforcement begins in May.

Sara Peters, Senior Editor

November 17, 2017

8 Min Read

If you've been comforting yourself with the thought "I'm sure there will be a grace period for the EU's General Data Protection Regulation," think again, pal, because this is the grace period, and it's almost over. On May 25, 2018, enforcement actions for GDPR begin; many if not most of us aren't ready, and we really have no good excuse.

Two out of every five respondents to a new survey released Thursday by Thales said they don't believe they'll be fully prepared for GDPR when enforcement actions kick in: specifically, 38% of respondents in the UK, 44% in Germany, and 35% in the US. Other recent surveys turn up similar results. Aside from the fact that GDPR officially went into effect in 2016, why are this privacy law and the controls it requires coming as such a surprise? We should have seen this coming from 10 miles and 12 years away.

The Canary in the Coalmine: ChoicePoint

My first warning came one month after I started covering cybersecurity: the ChoicePoint breach, which occurred in 2004 but wasn't revealed until February 2005. 

The personally identifiable information -- including name, address, and Social Security number -- of 163,000 people was exposed when data broker ChoicePoint (since purchased by Lexis Nexis) sold it to phony businesses set up by an alleged crime ring. Roughly 800 people became victims of identity theft as a result of the incident. At first, ChoicePoint notified only the affected individuals covered under California's then-new data breach notification law; only later did it inform victims in other states, which had no such laws yet. The Federal Trade Commission fined the company $10 million, plus an additional $5 million to establish a fund for victims.

ChoicePoint was big news precisely because nobody knew who ChoicePoint was. The individuals receiving breach notification letters and suffering from identity theft were not ChoicePoint customers. The company was not a household name. Most people were not aware that companies like ChoicePoint even existed.

The incident showed that, in America at least, individuals do not own their personal data; they don't even hold a penny stock in it. It showed that the organizations that own, buy, and sell that data might do a lousy job of securing it, even when they market themselves as security service providers.

It wasn't a hack that caused the breach; it was bad business. But the company's chief information security officer suffered a great deal of public criticism just the same, including from those within his own industry.

CISOs everywhere agreed "let's make sure we're not the next ChoicePoint!" And then every company decided to become the next ChoicePoint.

Big Data Revolution  

Now, every company wants to know everything about everyone, everywhere, all the time. Suggest to a marketing or sales person that their company might succeed without that information, and they break out in hives and look for the men in white coats to take you away. Nearly every organization now is a steward of some form of sensitive info.

So, developers made it as easy as possible for people to hand over their personal data: auto-checked "accept" boxes, auto-fill forms, "share on social media" buttons. New business models and job titles centered around getting customers to buy services with data, not cash.

Cybersecurity people knew this could cause trouble. And it did. What we should have known is that the trouble would eventually lead to a reckoning. Every PII breach was a warning. Malvertising was a warning. The plummeting price of credit card numbers on the black market was a warning. Every free cloud IT service, every targeted ad, every one of Facebook Messenger's incessant requests to turn on notifications was a warning. Every time you read the news and wondered "why wasn't that patched," "why wasn't that encrypted," "why was that connected to the internet," "why wasn't that disposed of correctly," "why would they even collect that," was a warning. 

There was a disturbance in The Force. Eventually something had to give. 

GDPR: A New Hope

Bigger, badder data privacy law would have to come, surely. And not just "tattle-tale" data breach notification laws or checklist-happy "set-it-and-forget-it" style regulations coming out of the industry. Eventually the world would call for a law that might genuinely be inspired by the idea that people have a right to privacy.

And of course, if such an idea were to arise, it was going to come from Europe. The Swedes passed their first data protection law in 1973. The European Union issued the GDPR predecessor, the EU Data Protection Directive, in 1995. Discussions for a replacement started in 2009; GDPR was proposed in 2012, approved in 2014, and officially adopted in 2016.

Dark Reading has been writing about GDPR for years, but readers didn't take much notice until the Equifax breach.

So we've had at least five years to prepare for a law that went into effect 19 months ago, and we're still not ready. Worse yet, we don't particularly want to be. 

According to the Thales report: "Interestingly, while around one in five (22% in the UK, 24% in Germany, 20% in the US) believe the GDPR will lead to fewer data breaches, a significantly higher proportion (32% in the UK, 31% in Germany, 49% in the US) are concerned that its implementation will actually result in an increased number of breaches."

Nearly half of the Americans surveyed think GDPR will increase the number of breaches.

What a cop-out. 

True, cybersecurity professionals do have legitimate gripes when it comes to privacy/security regulations: Bean-counters who approve budgets may grant only enough beans to achieve compliance, leaving too few to achieve real security. Regulations may be overly prescriptive about the controls they require, preventing organizations from deploying newer, better security solutions. And compliance efforts can prevent organizations from using a risk-based approach to security.

Legitimate. But they don't hold water when it comes to a couple of realities about GDPR.

For one thing, what organizations that use a risk-based approach to data privacy don't understand is that it's not their privacy they're putting at risk. An organization can manage the financial fallout of a data breach with effective incident response, good PR, and cyber insurance. An individual, however, can't get a job because a background check ordered from a ChoicePoint lookalike turned up an awful credit score that makes management think "irresponsible" and infosecurity think "insider threat." An individual can't sleep because their crazy ex-partner now knows where their kids go to school. An individual encounters any number of other complications because the religion they practice, the medicines they take, and the mistakes they've made are all publicly available to be manipulated. The industry-created standards, oopsy-daisy notification laws, and risk-based enterprise security management strategies we already have don't take any of that into account.

GDPR isn't all that prescriptive on cybersecurity controls, either. Many of the things it does ask for are reasonable and are the same kinds of things that security people have been asking for all along: better data inventory, better data destruction, better monitoring, regular vulnerability testing, the principle of least privilege, encryption where necessary, and applications that aren't full of holes.

We groan and mock developers for writing insecure applications that security teams have to fix later. Well, GDPR says that applications need to be "secure by design." Now developers have to listen to us. The inventory and destruction mandates should help wipe out the "shadow IT" problem. The security-of-data-processing section of the legislation, where it mentions encryption and pseudonymization, even includes nice, flexible, risk-friendly language like "implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk." Sure, the 72-hour breach notification requirement is a bit scary. "The right to be forgotten" is a bit scary. But that's only because so many of us have been doing a terrible job to this point.
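Pseudonymization, as that flexible language suggests, can be as simple as swapping direct identifiers for keyed hashes. Here is a minimal sketch in Python; the field names, key handling, and `pseudonymize` helper are illustrative assumptions, not anything the regulation prescribes:

```python
import hmac
import hashlib

def pseudonymize(value: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    The secret key must live separately from the pseudonymized data;
    whoever holds the key can re-link records, which is why this is
    pseudonymization, not anonymization.
    """
    return hmac.new(secret_key, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical record; real key management is out of scope for this sketch.
key = b"keep-this-in-a-separate-vault"
record = {"name": "Jane Doe", "ssn": "078-05-1120"}
pseudonymized = {field: pseudonymize(v, key) for field, v in record.items()}
# Same input + same key -> same pseudonym, so joins across tables still work.
```

A secret key (rather than a plain hash) matters here because low-entropy identifiers like Social Security numbers could otherwise be brute-forced back to their originals.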

Businesses are not teenagers living in messy bedrooms, driving recklessly. We should not hide it from Dad when we lose his credit card, or when we get a parking ticket with his car, lose the ticket, neglect to pay it, and don't bother to address it until Dad's gotten nicked for driving with a suspended license. (Sorry, Dad.) When other people give us their stuff, we should know we have it, know where we put it, be able to return it when they want it back, and at the very least tell them when we lost it, broke it, gave it to someone else, or let it get stolen.

GDPR simply codifies the fact that "personally identifiable information" is someone else's stuff, and should be treated accordingly. If legislation like this actually increases the number of breaches that we have, then we're doing something wrong. If potential penalties of 20 million euros or 4% of our global annual revenue, whichever is higher, don't help us obtain better budgets, then we're doing something wrong. And if, after Equifax and all the other data breaches that don't get covered because there aren't enough employed reporters in the world, we still don't think GDPR is necessary, then we're doing something wrong. 


About the Author(s)

Sara Peters

Senior Editor

Sara Peters is Senior Editor at Dark Reading and formerly the editor-in-chief of Enterprise Efficiency. Prior to that, she was senior editor for the Computer Security Institute, writing and speaking about virtualization, identity management, cybersecurity law, and a myriad of other topics. She authored the 2009 CSI Computer Crime and Security Survey and founded the CSI Working Group on Web Security Research Law -- a collaborative project that investigated the dichotomy between laws regulating software vulnerability disclosure and those regulating Web vulnerability disclosure.


