
Perimeter
Commentary
6/8/2010 11:49 AM
Adrian Lane

Massachusetts Data Privacy Standard: Comply Or Not?

In my previous position at a database security vendor, I was often asked by marketing to explain the applicability of technology to problems: how you could use assessment for PCI compliance, or why database activity monitoring was applicable to privacy laws, for example.

Invariably, the papers I wrote mirrored what every vendor does: create a logical explanation of how bits and pieces of the security product address compliance challenges. But however logical, this misses the real-world issue of motivation.

A given security widget may genuinely address a regulation. It may even be the best method to meet compliance, but that is irrelevant if the customer does not view compliance as mandatory. Why would they spend money on a product to address a problem they don't acknowledge?

It is for these reasons that I poke fun at myself when reading my older white papers, just as I now ridicule all vendor justification papers: The grand philosophical arguments are trumped by business realities.

I got to thinking about this when the Massachusetts data privacy standard 201 CMR 17.00 went into effect in March. Every vendor rolled out its perspective on how it addresses the "challenges" of this new regulatory requirement. But customers' questions about the importance of this regulation -- whether and how they must comply with the law, and what they are willing to pay for solutions -- are still very much up in the air.

I use Sarbanes-Oxley as an example to illustrate what I mean. In 2005, I learned directly from a dozen Fortune 500 CEOs that they had no intention of complying with SOX: They would first wait to see whether the law would be repealed. If it was not repealed, then they would wait to see whether the regulation was going to be enforced. If it was enforced, then they'd weigh the fines versus the additional auditing costs. Would it be cheaper to legally challenge the law? They would re-evaluate two years down the road, and if they found they needed to comply, then they would throw up their hands, say "mea culpa," and provide their remediation plans.

There were layers of business risk analysis that occurred long before they dug into understanding and reporting financial accounting risks. HIPAA was no different. I am betting Massachusetts data privacy standard 201 CMR 17.00 falls into the same category of "wait and see."

Companies will wait for a while before starting a compliance program. Once a business conducts the evaluation of its security risks to sensitive data, it's somewhat compelled to act if it finds something bad. It's one thing to claim you were not sure if there was a risk or did not think the law applied to you. But it looks really bad in court, after a breach has occurred, if you were aware of the risks and did nothing. So the mindset is that it is best to remain ignorant altogether.

The logical part of my brain loves that the Massachusetts privacy law tasks companies with understanding their data security risks. Risk is something they understand. The regulation specifies that you build a security plan around the challenges specific to your organization's risk, looking at both the data and the systems that manage that data. There is no encryption loophole for companies to excuse sloppy security. They will have a tough time blaming data theft on product vendors if they select the wrong product, or scapegoating IT if they do not respond to discovered risks. I think this law makes meaningful advances in data security requirements, and I like to think that companies that make money from personal information should have some custodial responsibility for the safety and security of other people's information.

So I was thinking, "Great, I'll write a justification piece for database security." In practice, Social Security numbers, driver's license numbers, passwords, and other personal information are stored in relational databases. 201 CMR 17.00, HIPAA, and many other regulations don't call out database security specifically, but relational databases are the primary storage management systems for this data. My intention when starting this blog post was to outline a plan for understanding risk and to show how auditing, assessment, database activity monitoring (DAM), encryption, access controls, and application programming can be applied to meet different threat types. I may still do that, but the question remains whether companies will actually take action to understand the risks. Is this law actually a business driver?
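As a purely illustrative sketch of the kind of control I mean -- not a prescription from the law or from any particular product -- the snippet below encrypts a Social Security number at the application layer before it lands in a relational database, so the stored column never contains cleartext. It assumes Python with the third-party cryptography package and SQLite; the table name, column names, and inline key handling are placeholders, not recommendations.

# Illustrative sketch only. Assumes the third-party "cryptography" package
# (pip install cryptography) and Python's built-in sqlite3 module.
import sqlite3
from cryptography.fernet import Fernet

# In a real deployment the key would come from a key-management system,
# not be generated next to the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, ssn_enc BLOB)")

def insert_customer(name: str, ssn: str) -> None:
    # Encrypt the SSN in the application; the database column never sees cleartext.
    token = cipher.encrypt(ssn.encode("utf-8"))
    conn.execute("INSERT INTO customer (name, ssn_enc) VALUES (?, ?)", (name, token))

def get_ssn(customer_id: int) -> str:
    # Decryption happens only in the application, behind whatever access controls it enforces.
    row = conn.execute("SELECT ssn_enc FROM customer WHERE id = ?", (customer_id,)).fetchone()
    return cipher.decrypt(row[0]).decode("utf-8")

insert_customer("Jane Doe", "123-45-6789")
print(get_ssn(1))  # recoverable only with the key

A real program would layer auditing, assessment, and access controls on top of something like this, but the point of the sketch is simply that the database itself never holds recoverable personal data without the key.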

If you want to see a security program outline for meeting Massachusetts data privacy standard 201 CMR 17.00, I would be happy to post one here on the blog. (Let me know via the "Comments" box below.)

But I think a discussion about whether this law will be taken seriously needs to happen before we decide how to comply.

Adrian Lane is an analyst/CTO with Securosis LLC, an independent security consulting practice. Special to Dark Reading. Adrian Lane is a Security Strategist and brings over 25 years of industry experience to the Securosis team, much of it at the executive level. Adrian specializes in database security, data security, and secure software development. With experience at Ingres, Oracle, and ...
