Dark Reading is part of the Informa Tech Division of Informa PLC



Sara Peters

We're Still Not Ready for GDPR? What Is Wrong With Us?

The canary in the coalmine died 12 years ago, the law went into effect 19 months ago, but many organizations still won't be ready for the new privacy regulations when enforcement begins in May.

If you've been comforting yourself with the thought "I'm sure there will be a grace period for the EU's General Data Protection Regulation," think again, pal, because this is the grace period, and it's almost over. On May 25, 2018, enforcement actions for GDPR begin; many if not most of us aren't ready, and we really have no good excuse.

Two out of every five respondents to a new survey released Thursday by Thales stated that they don't believe they'll be fully prepared for GDPR when enforcement actions kick in: specifically, 38% of respondents in the UK, 44% in Germany, and 35% in the US. Other recent surveys turn up similar results. Aside from the fact that GDPR officially went into effect in 2016, why are this privacy law and the controls it requires coming as such a surprise? We should have seen this coming from 10 miles and 12 years away. 

The Canary in the Coalmine: ChoicePoint

My first warning came one month after I started covering cybersecurity: the ChoicePoint breach, which occurred in 2004 but wasn't revealed until February 2005. 

The personally identifiable information -- including name, address, and Social Security number -- of 163,000 people was exposed when data broker ChoicePoint (since purchased by Lexis Nexis) sold it to phony businesses set up by an alleged crime ring. Roughly 800 people became victims of identity theft as a result of the incident. ChoicePoint at first notified only those affected individuals covered under California's young data breach notification law; only later did it inform victims in other, still-uncovered states. The Federal Trade Commission fined the company $10 million, plus an additional $5 million to establish a fund for victims.

ChoicePoint was big news precisely because nobody knew who ChoicePoint was. The individuals receiving breach notification letters and suffering from identity theft were not ChoicePoint customers. The company was not a household name. Most people were not aware that companies like ChoicePoint even existed.

The incident showed that, in America at least, individuals do not own their personal data; they don't even hold a penny stock in it. It showed that the organizations that own, buy, and sell that data might do a lousy job of securing it -- even when they market themselves as security service providers. 

It wasn't a hack that caused the breach; it was bad business. But the company's chief information security officer suffered a great deal of public criticism just the same, including from those within his own industry.

CISOs everywhere agreed "let's make sure we're not the next ChoicePoint!" And then every company decided to become the next ChoicePoint.

Big Data Revolution  

Now, every company wants to know everything about everyone, everywhere, all the time. Suggest to a marketing or sales person that their company might succeed without that information and they break out in hives and look for the men in white coats to take you away. Nearly every organization now is a steward of some form of sensitive info. 

So, developers made it as easy as possible for people to hand over their personal data: auto-checked "accept" boxes, auto-fill forms, "share on social media" buttons. New business models and job titles centered around getting customers to buy services with data, not cash.

Cybersecurity people knew this could cause trouble. And it did. What we should have known is that the trouble would eventually lead to a reckoning. Every PII breach was a warning. Malvertising was a warning. The plummeting price of credit card numbers on the black market was a warning. Every free cloud IT service, every targeted ad, every one of Facebook Messenger's incessant requests to turn on notifications was a warning. Every time you read the news and wondered "why wasn't that patched," "why wasn't that encrypted," "why was that connected to the internet," "why wasn't that disposed of correctly," "why would they even collect that," was a warning. 

There was a disturbance in The Force. Eventually something had to give. 

GDPR: A New Hope

Bigger, badder data privacy law would have to come, surely. And not just "tattle-tale" data breach notification laws or checklist-happy "set-it-and-forget-it" style regulations coming out of the industry. Eventually the world would call for a law that might genuinely be inspired by the idea that people have a right to privacy.

And of course, if such an idea were to arise, it was going to come from Europe. The Swedes passed their first data protection law in 1973. The European Union issued the GDPR predecessor, the EU Data Protection Directive, in 1995. Discussions for a replacement started in 2009; GDPR was proposed in 2012, approved in 2014, and officially adopted in 2016. 

Dark Reading has been writing about GDPR for years, but readers didn't take much notice until the Equifax breach.

So we've had at least five years to prepare for a law that went into effect 19 months ago, and we're still not ready. Worse yet, we don't particularly want to be. 

According to the Thales report: "Interestingly, while around one in five (22% in the UK, 24% in Germany, 20% in the US) believe the GDPR will lead to fewer data breaches, a significantly higher proportion (32% in the UK, 31% in Germany, 49% in the US) are concerned that its implementation will actually result in an increased number of breaches."

Half of the Americans surveyed think GDPR will increase the number of breaches.

What a cop-out. 

True, cybersecurity professionals do have legitimate gripes when it comes to privacy/security regulations: Bean-counters who approve budgets may grant only enough beans to achieve compliance, leaving too few to achieve real security. Regulations may be overly prescriptive with the controls they require, preventing organizations from deploying newer, better security solutions. Compliance efforts can prevent organizations from using a risk-based approach to security.

Legitimate. But they don't hold water when it comes to a couple of realities about GDPR.

For one thing, what organizations that use a risk-based approach to data privacy don't understand is that it's not their privacy they're putting at risk. An organization can manage the financial fallout of a data breach with effective incident response, good PR, and cyber insurance. An individual, however, can't get a job because a background check ordered from a ChoicePoint lookalike turned up an awful credit score that makes management think "irresponsible" and infosecurity think "insider threat." An individual can't sleep because their crazy ex-partner now knows where their kids go to school. An individual encounters any number of other complications because the religion they practice, the medicines they take, and the mistakes they made are all publicly available to be manipulated. The industry-created standards, oopsy-daisy notification laws, and risk-based enterprise security management strategies we already have don't take any of that into account.

GDPR isn't all that prescriptive on cybersecurity controls, either. Many of the things it does ask for are reasonable and are the same kinds of things that security people have been asking for all along: better data inventory, better data destruction, better monitoring, regular vulnerability testing, the principle of least privilege, encryption where necessary, and applications that aren't full of holes. 

We groan and mock developers for writing insecure applications that you have to fix later. Well, GDPR says that applications need to be "secure by design." Now developers have to listen to you. The inventory and destruction mandates should help wipe out your "shadow IT" problem. The security of data processing section of the legislation, where it mentions encryption and pseudonymization, even includes nice flexible risk-friendly language like "implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk." Sure, the 72-hour breach notification requirement is a bit scary. "The right to be forgotten" is a bit scary. But that's only because so many of us have been doing a terrible job to this point.

Businesses are not teenagers living in messy bedrooms, driving recklessly. We should not hide it from Dad when we lose his credit card, or get a parking ticket with his car, lose the ticket, neglect to pay the ticket, and not bother to address it until Dad's gotten nicked for driving with a suspended license. (Sorry, Dad.) When other people give us their stuff, we should know we have it, know where we put it, be able to return it when they want it back, and at the very least tell them when we lost it, broke it, gave it to someone else, or let it get stolen.

GDPR simply codifies the fact that "personally identifiable information" is someone else's stuff, and should be treated accordingly. If legislation like this actually increases the number of breaches that we have, then we're doing something wrong. If potential penalties of 20 million euros or 4% of our global annual revenue, whichever is higher, don't help us obtain better budgets, then we're doing something wrong. And if, after Equifax and all the other data breaches that don't get covered because there aren't enough employed reporters in the world, we still don't think GDPR is necessary, then we're doing something wrong. 
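The penalty arithmetic mentioned above is worth spelling out. Here is a minimal sketch (a hypothetical illustration of the upper fine tier, not legal advice; the function name and sample revenue figure are invented for the example):

```python
def max_gdpr_fine(global_annual_revenue_eur: float) -> float:
    """Upper tier of GDPR administrative fines: EUR 20 million or
    4% of total worldwide annual turnover, whichever is higher."""
    return max(20_000_000.0, 0.04 * global_annual_revenue_eur)

# A hypothetical company with EUR 2 billion in annual revenue
# faces up to EUR 80 million, since 4% exceeds the EUR 20M floor:
print(max_gdpr_fine(2_000_000_000))  # 80000000.0

# A smaller firm with EUR 100 million in revenue still faces the
# EUR 20 million floor, because 4% of revenue is only EUR 4 million:
print(max_gdpr_fine(100_000_000))  # 20000000.0
```

For most large enterprises, the 4% branch dominates, which is exactly why the regulation changes the budget conversation.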


Sara Peters is Senior Editor at Dark Reading and formerly the editor-in-chief of Enterprise Efficiency. Prior to that she was senior editor for the Computer Security Institute, writing and speaking about virtualization, identity management, cybersecurity law, and a myriad ... View Full Bio

User Rank: Apprentice
7/16/2019 | 9:00:12 AM
2 years later!
Two years after this article was written, and we are still no further ahead.
Facebook has just been fined $5 billion, and its stock went up in price. One month's revenue!
It is cheaper to pay the fine than to actually do the right thing.

Facebook is in the business of selling your information. If you are getting the service for free, YOU are the product being sold!
User Rank: Strategist
11/20/2017 | 12:18:25 PM
Paying for Security
Back when I was a wee little lad getting my MBA, we had a class in calculating expected value. Using that 'back of the envelope' type of analysis, if you compare the risk of a €20,000,000 fine plus a €1,000,000-plus cost to remediate a reported breach against the certain expense of security, the stakes are now big enough that most companies, if following a rational(ized) budget process, may be willing to spend what it takes to become more secure.

No one says GDPR will prevent breaches. However, the number of reported breaches will increase until companies (at least those that survive the penalties) wise up and do what is right because it's 'good business' (i.e., economically viable).
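The commenter's back-of-the-envelope expected-value comparison can be sketched like this (the breach probability, security budget, and function names are made-up inputs for illustration, not figures from the article or the comment):

```python
def expected_breach_loss(p_breach: float, fine_eur: float,
                         remediation_eur: float) -> float:
    """Expected annual loss from a reported breach:
    probability of the breach times its total cost."""
    return p_breach * (fine_eur + remediation_eur)

# Hypothetical inputs: a 10% annual breach probability against a
# EUR 20M fine plus EUR 1M of remediation, compared with a certain
# EUR 1.5M annual security spend.
expected = expected_breach_loss(0.10, 20_000_000, 1_000_000)
security_budget = 1_500_000
print(expected)                    # 2100000.0
print(expected > security_budget)  # True: spending on security wins
```

Under these assumptions the expected loss exceeds the certain cost of a security program, which is the commenter's point: GDPR-scale fines tilt a rational budget process toward prevention.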