Heartbleed & The Long Tail Of Vulnerabilities

Commentary by Martin McKeay, 6/13/2014 12:00 PM

To this day there are still unpatched systems, still hackers scanning for vulnerable systems, and still cyber criminals using Heartbleed every day to break into companies.

Next week is London Technology Week in the UK, where innovative businesses will be showing off their best and brightest. Thursday afternoon, I’ll be talking about how the Heartbleed vulnerability is going to be with us for a long time as we continue to find it in the forgotten spaces of our networks.

It’s been just over two months since the disclosure of the Heartbleed vulnerability in OpenSSL -- one of the biggest security events in recent history. Webmasters, server administrators, and security professionals around the globe mobilized their forces and began a round of patching and updating that hadn’t been seen in years. According to Ivan Ristic at Qualys, we’ve been incredibly successful, and only about 1% of all web servers on the Internet are still vulnerable. So that’s the end of it, right? 

Hardly. In fact, we are nowhere near the end of our susceptibility to the Heartbleed vulnerability.
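
Spot-checking whether a particular server is still exposed is easy enough. The sketch below shows one minimal way to do it, assuming Nmap 6.45 or later (which ships the ssl-heartbleed NSE script) is installed and that you are authorized to scan the target; the host name is just a placeholder.

```python
# Minimal sketch: spot-check one host for Heartbleed (CVE-2014-0160) by
# wrapping Nmap's ssl-heartbleed NSE script. Assumes Nmap 6.45+ is on the
# PATH and that you have permission to scan the target.
import subprocess
import sys

def is_heartbleed_vulnerable(host, port=443):
    """Return True if Nmap's ssl-heartbleed script reports the host as VULNERABLE."""
    result = subprocess.run(
        ["nmap", "-p", str(port), "--script", "ssl-heartbleed", host],
        capture_output=True, text=True,
    )
    return "VULNERABLE" in result.stdout

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "www.example.com"  # placeholder host
    status = "still vulnerable" if is_heartbleed_vulnerable(target) else "no issue reported"
    print(f"{target}: {status}")
```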

Let's take a look at a concept called the "long tail." Popularized by Chris Anderson in his book, The Long Tail: Why the Future of Business Is Selling Less of More, the term refers to the trailing edge of items consumers purchase that individually don't amount to much but in aggregate might equal the top few selling items. The long tail is what makes sites like Etsy work: even if each merchant only sells a few items, the total across all merchants adds up to a lot of money flowing through the site, making it economically viable for the company to exist.

While the long tail originally referred to the relationship between the popular and less popular items that merchants sell, it’s long since been bastardized to refer to any relationship where there is a large group of primary instances followed by a large, distributed group of instances that fade into obscurity. That pretty much describes the process of patching and updating software in the modern age of software-everywhere and the Internet of Things. The majority of high-profile systems get patched immediately, while the large number of systems that are unmanaged or receive less attention languish and are patched much later, if ever.

Think back for a moment to October 2008. If you’re like me, there’s not a lot that sticks out in your memory immediately. But if you were in security or IT at the time, you’ll remember that month’s Microsoft patches, which included MS08-067, a vulnerability in the Microsoft Server Service that allowed for remote code execution. Panic ensued, and the vulnerability was patched and all was well again. Except it wasn’t.

To this day there are still systems out there that are unpatched, there are still attackers scanning for vulnerable systems, and there are still penetration testers using this vulnerability to break into companies every day. Attackers of all stripes still know they can find unmanaged systems at companies around the globe and can use these systems as jumping-off points to get into the rest of the corporate network. Even though most of the systems affected by MS08-067 have long been decommissioned, there's a long tail of systems still limping along in obscure corners of our networks that allow this vulnerability to be exploited on nearly a daily basis.
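
Finding those stragglers doesn't take anything sophisticated, which is exactly why attackers keep looking. As a rough illustration, the sketch below sweeps an address range using Nmap's smb-vuln-ms08-067 NSE script; the network range is a placeholder, a recent Nmap that ships the script is assumed, and you should only scan networks you are authorized to test.

```python
# Minimal sketch: sweep an address range for hosts still exposed to MS08-067,
# using Nmap's smb-vuln-ms08-067 NSE script. Assumes a recent Nmap on the PATH
# and written authorization to scan the range; 192.0.2.0/24 is a placeholder.
import subprocess

def hosts_exposed_to_ms08_067(network):
    """Return the addresses that Nmap's NSE vuln output flags as VULNERABLE."""
    result = subprocess.run(
        ["nmap", "-p", "445", "--script", "smb-vuln-ms08-067", network],
        capture_output=True, text=True,
    )
    exposed = []
    # Rough parse: split the report into per-host blocks and keep any block
    # containing the VULNERABLE marker printed by the script.
    for block in result.stdout.split("Nmap scan report for ")[1:]:
        if "VULNERABLE" in block:
            exposed.append(block.splitlines()[0].strip())
    return exposed

if __name__ == "__main__":
    for host in hosts_exposed_to_ms08_067("192.0.2.0/24"):
        print(f"MS08-067 still exploitable on {host}")
```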

Heartbleed will be much worse than MS08-067 could have been, when you consider its long tail. At least the Microsoft vulnerability only affected Windows systems; Heartbleed affects OpenSSL, which is used in such a diverse range of systems and devices that no one actually knows in how many places it resides. The websites and blogs were obvious, and few were surprised to find that OpenSSL was part of the Android operating system or that it was part of many VPN software suites. But what about all the less obvious places, like home routers, CCTV systems, and HVAC control systems? How many people realize that the SCADA systems that control our electricity and water also use OpenSSL as part of their software? Probably not as many as should know.

Does this mean we should be dumping OpenSSL from systems? NO! OpenSSL serves a purpose: It’s a way for us to have a common library for the encryption of Internet traffic, and a vulnerability like Heartbleed is a problem any software could have. In fact, the amount of attention that Heartbleed has brought to OpenSSL means that in the short term we’ll be seeing many more vulnerabilities exposed, but as time goes by and the vulnerabilities are exposed, patched, and mitigated, OpenSSL will emerge from the process much more secure than it’s been in the past. 

What we should be concentrating on now is identifying everywhere OpenSSL is running within our sphere of influence. It's not an easy task, since any device on your network could be using code from OpenSSL if it has an administrative interface. (That's assuming you know about the device in the first place.) The long tail of Heartbleed will be with us for years as we find all the nooks and crannies that hold OpenSSL code and patch them or take them offline.
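
A reasonable first pass, sketched below, is simply to enumerate every endpoint in your own address space that answers a TLS handshake, since those are the services most likely to be linked against OpenSSL and therefore worth a closer look. The host list, port list, and timeout are illustrative assumptions, and again, sweep only networks you are responsible for.

```python
# Minimal sketch: enumerate TLS endpoints on your own network as a first pass
# at finding where OpenSSL may be running. Hosts, ports, and timeout are
# illustrative placeholders; sweep only address space you are responsible for.
import socket
import ssl

COMMON_TLS_PORTS = [443, 465, 636, 993, 995, 8443]  # web, mail, LDAP, admin UIs

def find_tls_endpoints(hosts, ports=COMMON_TLS_PORTS, timeout=3.0):
    """Return (host, port) pairs that complete a TLS handshake."""
    context = ssl.create_default_context()
    context.check_hostname = False        # embedded gear often presents
    context.verify_mode = ssl.CERT_NONE   # self-signed certificates
    endpoints = []
    for host in hosts:
        for port in ports:
            try:
                with socket.create_connection((host, port), timeout=timeout) as sock:
                    with context.wrap_socket(sock, server_hostname=host):
                        endpoints.append((host, port))
            except (OSError, ssl.SSLError):
                pass  # closed port, non-TLS service, or unreachable host
    return endpoints

if __name__ == "__main__":
    for host, port in find_tls_endpoints(["192.0.2.10", "192.0.2.20"]):  # placeholders
        print(f"TLS service at {host}:{port} -- worth checking its OpenSSL build")
```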

As Dave Lewis highlighted in his recent article, "Undocumented Vulnerability in Enterprise Security," few of us know about every system on our networks, something that’s only getting more complicated as the Internet of Things becomes less of a buzzword and more of a part of the fabric of our lives. When you think of how many undocumented nodes exist on a network, the long tail of vulnerabilities like Heartbleed only becomes more daunting and harder to deal with.

Martin McKeay is a Senior Security Advocate at Akamai, having joined the company in 2011. As a member of Akamai's Security Intelligence Team, he is responsible for researching security threats, customer education, and industry intelligence. With more than 15 years ...
Comments
Bprince, 6/23/2014 12:33:15 AM
unpatched systems
It's unfortunately a reality that many people are going to leave systems unpatched long after a patch is available. In that case, if there is a breach, there should be legal liability for the business, unless we're talking about a power plant or something else with a reasonable excuse. I mean, really: if you haven't patched the vulnerability at the center of Conficker (MS08-067) by 2014, that's a problem.

BP
Christian Bryant, 6/16/2014 3:15:48 PM
Re: Whose Responsibility?
@ Marilyn Cohodas @ mckeay

When we look at the stakes and how many are affected, how can we not act, regardless of the fears of business owners and private organizations? And I can't see the government being involved directly; look at BP in the Southern US. A travesty on an unforgivable scale, and one of many incidents where a protective agency like the EPA, which should have more power, doesn't, or where someone is fearful of losing "big money" for the US. I won't speak to criminal negligence here. So what if we created a CEPA (Cyber Environmental Protection Agency)? I can't see that going well if it was government-run. But rather than simply put the task in the hands of hacktivists who would be risking legal action (as many do with the publishing of exploits every day), we'd need at least government sign-off and local law enforcement sign-off; highly unlikely, I know. Something needs to be done. I hate open questions...
Marilyn Cohodas, 6/16/2014 8:46:21 AM
Re: Whose Responsibility?
Have to agree with Martin that a government agency isn't the answer, but the idea of an intelligence-sharing effort between government and the private sector is definitely worth considering, along the lines of what the retail industry is attempting in the wake of Target and other breaches, as well as similar endeavors in financial services and defense.

 


mckeay, 6/16/2014 5:32:47 AM
Re: Whose Responsibility?
While the idea of having a governmental agency perform environmental cleanup of old technologies and unpatched servers seems interesting, I am not sure it'd be any better received than the real Environmental Protection Agency in the US.

Let's think about this for a moment. Right now, government agencies that do security are already thought of in the dimmest light possible, and the NSA has been accused of hoarding 0-day vulnerabilities for its own use. Do you think an agency charged with scanning for unpatched systems wouldn't be tempted to do the same, or encouraged to do so by intelligence-gathering organizations?

Another thing to consider is that businesses aren't going to appreciate having someone from the outside testing their systems and telling them they have to spend resources on fixing the problem. Most businesses already know they have systems that aren't getting patched; rather than fixing the problem, they'd prefer to ignore it and take the chance it won't get discovered.

One final point: this no longer applies just to high-tech businesses. Every business has some aspect of high tech to it, some connection to the Internet, and some level of dependence on that technology. Should the businesses that supply this technology be more concerned with patching and updating? Of course, but until the businesses relying on the technology demand it and there's money riding on those patches, nothing's going to change.

The idea of a not-for-profit organization that employs scanning technology and notifies businesses of their exposed vulnerabilities is interesting. But given the current legal environment in most Western countries, this would be a risky endeavor at best, since even scanning a company can be met with legal action if you annoy the wrong person. There are a lot of unintended consequences of the current legal system, and I'm not sure there wouldn't be just as many if we changed the laws to allow for scanning and notification.
Christian Bryant, 6/13/2014 1:55:52 PM
Whose Responsibility?
 

Excellent point. But it raises the question: Who is responsible? See, for all those thousands of systems out there that make up the long tail, should it only be cyber criminals scanning the length of it until they find vulnerable systems? The obvious answer is that the IT staff who own the systems should be doing that, too, but as history shows, they aren't all owning up to their responsibilities. So who?

I'd always imagined there would be an organization of white hatters who, with documented, iron-clad passports to hack from law enforcement and government agencies, would work day in and out doing exactly what the black hatters are doing except, once they find a vulnerable system, they immediately lock it down, or reach out to the owners and get them to do their job.

If that sounds more like a superhero comic book than reality, take account of the trillions of American dollars lost to cyber crime (and then add in every other country subject to cyber criminal activity on top of that) and ask whether it isn't worth investing in a group like this that essentially mimics a cyber criminal crew up to the last action, then takes one more vulnerable system out of the equation.

The high-tech industries have a responsibility to the average citizen to provide assurances like this, just as our government provides law enforcement and a military, because high tech is where this threat comes from. Software giants have established an electronic frontier that is basically pushed upon the everyday person, whether they want it or not, yet they take little global responsibility for the security and restoration of the lives harmed through the necessity of high tech in today's society.

How about the next few million dollars invested in tech go to forming a team like this that can make a real nationwide difference, not for profit, simply to give back to the millions of people hurt by an ecosystem they may not even have wanted in their lives? For me, someone who eats, breathes, and dreams tech, that is the least we can do; when the power goes down, it's those people we'll need to be friends with, not silicon billionaires.