Commentary by Scott Petry | 2/2/2016

As Good As They're Getting, Analytics Don't Inherently Protect Data

It is only a matter of time before your system is breached, and when your data is lost, analytics won't help you.

“Use analytics to secure your systems!” The phrase is becoming more common in today’s over-hyped security industry. Certainly the tools continue to improve, with real-time data collection, large-scale manipulation, and advanced visualization. And the marketing terminology keeps pace: “machine learning,” “predictive analytics,” and “Big Data.” These innovations deliver better insight into complex systems, and there is no question that they play a role in managing networks and resources. But analytics is just a fancy word for monitoring.

The suggestion to “use analytics to secure your system” is flawed, and the argument to shift away from data security measures like encryption and toward analytics is fallacious. In fact, analytics is not an either-or choice with encryption. Suggesting that firms choose between the two is like a doctor telling a patient to choose either vitamins or exercise. Both have their place in a healthy lifestyle.

As good as they’re getting, analytics don’t inherently protect data. It is therefore unequivocally prudent to use encryption as well.

[COUNTERPOINT: Encryption Has Its Place But It Isn’t Foolproof by Doug Clare, Vice President of Product Management, FICO]

Let’s say a healthcare company processes HIPAA data and experiences a breach. A well-functioning analytics package would be able to tell the incident team how the breach happened and what data was leaked. It might even have alerted teams that they were under attack so they could respond. But if data leaked, the company’s remediation steps are vastly different depending on whether or not the data was encrypted. Pick any victim of 2015’s high-visibility data breaches, from Anthem to OPM, and rethink the impact if that data had been properly encrypted.

Encryption is designed to do its job precisely when unauthorized parties gain access to data: it is the last line of defense once a breach has occurred. And make no mistake, breaches will occur.
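To make “last line of defense” concrete, here is a minimal sketch of field-level encryption at rest in Python, using the cryptography package’s Fernet recipe. The field and key handling are simplified for illustration; in practice keys belong in a KMS or HSM, never beside the data:

from cryptography.fernet import Fernet

# In production the key would live in a KMS or HSM, never alongside the data.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a sensitive field before it is written to the data store.
ssn_plaintext = b"123-45-6789"  # hypothetical HIPAA-covered value
ssn_ciphertext = fernet.encrypt(ssn_plaintext)

# An attacker who exfiltrates the store sees only ciphertext; authorized
# code holding the key can still recover the value.
assert fernet.decrypt(ssn_ciphertext) == ssn_plaintext

Had the Anthem or OPM records been stored this way, the stolen files would have been largely useless without the keys.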

Not if, but when

Prudent infosec professionals think about security in the context of reducing threat surface area and minimizing damage in the event of an exploit. The joke used to be that the only secure system is the one that isn’t connected to the Internet. But with the spate of embedded-systems exploits, like the firmware hacks against hard disk controllers, that joke needed an update. Experts agree: there is no such thing as a completely secure system.

If you share that perspective, then much of your risk assessment must focus on what happens post-breach.

Quantifying that risk is where the true value of encryption comes into play. Your risk profile is determined by the cost of data loss: whether it takes the form of reputational damage, loss of certifications or accreditations, or legal clean-up fees, the usefulness of the data to the bad guys determines the level of risk to your organization. Encryption minimizes this risk by reducing the usefulness of the data. Analytics does not do that.
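A back-of-the-envelope model shows the shape of the argument. Every figure below is hypothetical, chosen only to illustrate how encryption moves the expected-loss needle:

# Hypothetical inputs, for illustration only.
records = 80_000_000       # records held (roughly Anthem-scale)
cost_per_record = 150.0    # assumed clean-up cost per usable record, in dollars
p_breach = 0.10            # assumed annual probability of a breach

# If exfiltrated data is usable plaintext, every record carries full cost.
expected_loss_plaintext = p_breach * records * cost_per_record

# Strong encryption reduces the usefulness of stolen data; model that as a
# residual-usefulness factor (assume 1%: mishandled keys, unencrypted copies).
residual_usefulness = 0.01
expected_loss_encrypted = expected_loss_plaintext * residual_usefulness

print(f"expected annual loss, plaintext: ${expected_loss_plaintext:,.0f}")  # $1,200,000,000
print(f"expected annual loss, encrypted: ${expected_loss_encrypted:,.0f}")  # $12,000,000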

While encryption is the best and last line of defense in the event of a breach, the sad truth is that it isn’t as mainstream as it should be. The underlying methods are decades old and relatively straightforward to implement, but the industry has been unable or unwilling to deploy them.

That isn’t because organizations are spending their time monitoring and remediating systems. In fact, most don’t follow even the basic processes that would protect them from obvious threats. The Sony Pictures breach of 2014 highlights this perfectly. The hackers announced that they had Sony’s data nine months after the initial compromise. During that nine-month period, the bad guys poked around the SPE network, installed an egress server in the Sony DMZ, and extracted terabytes of data. Basic system and network monitoring should have flagged these anomalous activities. Why didn’t Sony know?
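Flagging that kind of activity doesn’t require exotic analytics. Here is a minimal sketch of the baseline check in question, in Python; the host, traffic volumes, and three-sigma threshold are all hypothetical:

from statistics import mean, stdev

def flag_anomalous_egress(daily_bytes_history, todays_bytes, sigmas=3.0):
    """Flag today's outbound volume if it exceeds mean + sigmas * stdev
    of this host's historical baseline."""
    mu = mean(daily_bytes_history)
    sigma = stdev(daily_bytes_history)
    return todays_bytes > mu + sigmas * sigma

# Hypothetical host: ~2 GB/day of normal outbound traffic, then a spike.
baseline = [2.1e9, 1.8e9, 2.3e9, 2.0e9, 1.9e9, 2.2e9, 2.0e9]
print(flag_anomalous_egress(baseline, 2.1e9))   # False: a normal day
print(flag_anomalous_egress(baseline, 4.0e11))  # True: 400 GB leaving the DMZ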

Similarly, congressional testimony following the OPM breach showed that the agency had been warned about the vulnerability of its systems for years, up to and including specific recommendations from the Inspector General, which were summarily ignored.

Organizations don’t do well with repetitive, mundane tasks like monitoring, interpreting, and remediating. It’s kind of like flossing your teeth: even the most committed among us will skip a day or two. And with more sophisticated analytics tools generating more real-time data, you need more sophisticated people and processes, which are expensive. That is why outsourcing the security operations center (SOC) is becoming more common.

The human requirement

The analytics side would have you believe that “machine learning” will reduce or eliminate the burden on IT. It can reduce the burden, but no technology is so advanced that it is fully autonomous, and it would be dangerous to assume otherwise.

Systems need to be trained by humans, using data that represents the ecosystem well enough that anomalies can be identified. That’s hard, since 25% of all data breaches are due to human error. Those systems then need to be maintained by humans to balance the false-positive/false-negative tradeoffs, and they’ll need to be retrained as the underlying systems change.
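That tradeoff is easy to see in miniature. The sketch below sweeps an alert threshold across a handful of hypothetical scored events; every number is invented, but the pattern is universal: a lower threshold misses fewer attacks at the price of more false alarms, and the right balance drifts as the underlying systems change.

def tradeoff(scored_events, threshold):
    """Count false positives and missed attacks at a given alert threshold."""
    fp = sum(1 for score, is_attack in scored_events
             if score >= threshold and not is_attack)
    fn = sum(1 for score, is_attack in scored_events
             if score < threshold and is_attack)
    return fp, fn

# (anomaly score, was it actually an attack?) -- hypothetical labeled events
events = [(0.2, False), (0.4, False), (0.55, True), (0.6, False),
          (0.7, True), (0.8, False), (0.9, True), (0.95, True)]

for threshold in (0.5, 0.7, 0.9):
    fp, fn = tradeoff(events, threshold)
    print(f"threshold={threshold}: {fp} false positives, {fn} missed attacks")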

Even the best predictive models still require grey-matter analysis. We saw this at my last company, Postini, where our “machine learning system” processed more than three billion email messages per day. Each message had a set of attributes we’d use to separate legitimate email from spam, phishing, and virus-laden messages. As good as our systems were, and as diverse as the training sample was, we still needed humans performing around-the-clock review and adjustment to stay current.
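To give a flavor of what attribute-based classification looks like, here is a toy sketch; it is not Postini’s actual model, and the attributes, weights, and cutoff are invented for illustration:

# Hypothetical per-attribute weights that humans tune over time.
SPAM_WEIGHTS = {
    "sender_not_in_dns": 2.0,
    "html_only_body": 0.5,
    "url_shortener_links": 1.5,
    "attachment_is_executable": 3.0,
    "failed_spf": 2.5,
}

def spam_score(attributes):
    """Sum the weights of the attributes present on a message."""
    return sum(SPAM_WEIGHTS.get(a, 0.0) for a in attributes)

msg = ["html_only_body", "failed_spf", "url_shortener_links"]
score = spam_score(msg)
print(f"score={score}, classified as {'spam' if score >= 4.0 else 'legit'}")
# Humans re-tune the weights and the 4.0 cutoff as attacker behavior shifts --
# the around-the-clock review described above.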

That was just for email threats. The signals generated across Web services, data stores, and applications are orders of magnitude more diverse than an email payload. Machines cannot auto-detect and auto-resolve.

From a total-cost-of-ownership perspective, I’d also assert that implementing robust encryption is less expensive and operationally simpler than relying on analytics. Encryption requires specialized skills and a refactoring of data, data stores, and applications, but that work is done by software engineering teams through standard development processes. It doesn’t require the difficult cultural changes and the 24x7 attention that are necessary to make full use of analytics.

It is only a matter of time before your system is breached, and when your data is lost, analytics won’t help you. Would you prefer that your data be out there in the clear, or encrypted?

Scott Petry is Co-Founder and CEO of Authentic8. Prior to Authentic8, Scott founded Postini and served in a variety of C-level roles until its acquisition by Google in 2007. He served as Director of Product Management at Google until 2009. Prior to Postini, Scott was General ...