4/25/2014 10:00 AM

After Heartbleed, Tech Giants Fund Open Source Security

In the wake of the Heartbleed vulnerability, 12 tech giants -- including Facebook, Google, IBM, and Microsoft -- each pledge $100,000 annually to improve core open source technology such as OpenSSL.

The Linux Foundation on Thursday announced that 12 leading technology firms have each pledged $100,000 per year, for the next three years, to fund open source projects. The new Core Infrastructure Initiative represents the industry's response to the Heartbleed bug found earlier this year in OpenSSL, the open source SSL/TLS library. The vulnerability highlighted that more than half of the world's Web servers rely on cryptographic software maintained by an open source project that receives only about $2,000 per year in donations, even as the Internet ecosystem has become far more complex and interoperability requirements have increased.

"There are certain projects that have not received the level of support commensurate with their importance," the Linux Foundation said in a statement. "As we just witnessed with the Heartbleed crisis, too many critical open source software projects are under-funded and under-resourced."

But that's about to change, with the first wave of Core Infrastructure Initiative supporters having now collectively pledged $1.2 million per year through 2016. Those 12 supporters are Amazon Web Services, Cisco, Dell, Facebook, Fujitsu, Google, IBM, Intel, Microsoft, NetApp, RackSpace, and VMware.

The launch of the Core Infrastructure Initiative has been widely lauded. "This is fantastic," Dan Kaminsky, chief scientist at White Ops, says via email, emphasizing that the open source technology that facilitated the rise of so many Internet businesses requires ongoing investment to remain useful, usable, and secure.

"This isn't charity," he says of the initiative. "It's just very wise business."

The effort represents leading technology players agreeing to get proactive about securing and improving the many pieces of technology that collectively make up the Internet. "This is not just about the money, but the forum," Jim Zemlin, the executive director of the Linux Foundation, told the New York Times. "Instead of responding to a crisis retroactively, this is an opportunity to identify crucial open-source projects in advance. Right now, nobody is having that conversation, and it's an important conversation to have."

The first order of business will be examining OpenSSL, and potentially awarding "fellowship funding for key developers," as well as allocating resources to bolster security, outside reviews, and patch-turnaround speed for the library, according to the Linux Foundation. But it emphasized that the overall effort "will not be restricted to security-related issues."

Crucially, the Core Infrastructure Initiative also represents the technology industry putting its money where its mouth is. "There's an actual, stable commitment of money -- critical if there's to be full-time engineers hired to protect this infrastructure," says Kaminsky. Also important, he says, is the choice of a de-politicized nomenclature. "'Core Infrastructure' is a great name that avoids the baggage of 'critical infrastructure' while expressing the importance of attention," he says.

The launch of the initiative now paves the way for more businesses to get involved. "We have said that OpenSSL, an important tool for millions of large organizations, needs more oversight and support," Marc Gaffan, chief business officer at Web application firewall vendor Incapsula, says via email. "We’re happy to see the Linux Foundation step up to support OpenSSL and we look forward to the opportunity to participate in the program."

The Core Infrastructure Initiative isn't the only information security community change to have been triggered by the discovery of the Heartbleed bug, nor the only effort involved in repairing OpenSSL. In recent weeks, many security researchers have been building related patches, as well as hammering away at OpenSSL to try to identify any further bugs.

OpenBSD founder Theo de Raadt, for one, last week told Dark Reading that his group was looking to nuke legacy code and "risky code practices" in OpenSSL without breaking the code for anyone who's already using it. In particular, the group was eyeing OpenSSL's memory allocator, which de Raadt believes leaves the library vulnerable to attack. Based on those efforts, however, de Raadt this week announced that, rather than trying to salvage OpenSSL, the OpenBSD community has forked it and is building its own free SSL/TLS library, to be called LibreSSL.
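De Raadt's specific complaint about the allocator is easier to see with a small illustration. The sketch below is purely conceptual (it is not OpenSSL's actual code, and the names are invented), but it shows the pattern at issue: a library-level freelist that recycles buffers without clearing them, so freed secrets linger in memory and the operating system's malloc hardening never gets a chance to inspect those buffers.

```c
#include <stdlib.h>

/* Conceptual sketch only: a toy freelist in the style critics attribute
 * to OpenSSL's buffer cache. Names and sizes are invented. */
#define POOL_BUF_SIZE 4096
#define POOL_SLOTS    32

static void *freelist[POOL_SLOTS];
static int   freelist_top;

void *pool_alloc(void)
{
    if (freelist_top > 0)
        return freelist[--freelist_top]; /* previous contents still there */
    return malloc(POOL_BUF_SIZE);
}

void pool_free(void *buf)
{
    /* Buffers are cached rather than returned to the system allocator,
     * so guard pages, poisoning, and use-after-free checks in a hardened
     * malloc (such as OpenBSD's) never see them. Stale data, possibly
     * keys or plaintext, survives until the slot is reused. */
    if (freelist_top < POOL_SLOTS)
        freelist[freelist_top++] = buf;
    else
        free(buf);
}
```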

Even so, expect others to continue investing time and energy in improving OpenSSL or making it more functional. Google, for example, earlier this year rolled out a new TLS cipher suite, ChaCha20-Poly1305, for Chrome browsers, which required creating a new abstraction layer in OpenSSL. The new cipher suite is designed to run three times as fast on devices that don't have built-in AES hardware acceleration, which includes most smartphones, as well as Google Glass and older PCs.

"This improves user experience, reducing latency and saving battery life by cutting down the amount of time spent encrypting and decrypting data," Elie Bursztein, Google's anti-abuse research lead -- and one of the four coders involved in the project -- said Thursday in a blog post.


Mathew Schwartz is a freelance writer, editor, and photographer, as well as InformationWeek's information security reporter.

Comments
Christian Bryant, User Rank: Ninja
5/22/2014 | 6:02:22 AM
Psychology Change for FOSS Hackers
For decades there has been a combination of scientist and hobbyist hackers in the Free and Open Source Software (FOSS) community. On one hand you've had the very formal and high-tech programmers, doing development projects with lifecycles, and on the other a more artistic and experimental effort that includes varying levels of code quality, consistency of function usage/behavior, and a variety of security features, from none to ironclad. In between, lots of solid programmers delivering usable code every day.

Here's the thing: funding isn't everything; in some cases, it's worthless. Requirements psychology is a huge part of delivering a secure application, whether you're a PhD from MIT or a weekend Python hacker. In other words, be formal, experimental, hack the code or design the code, but you must still hold to a set of requirements to which the end result is compared, and these days security must be part of your application requirements set.

I keep hearing about time to test and how putting code through more intensive QA in some FOSS projects might prevent the next Google or Facebook from emerging. And various cash sums are called out to "throw" at projects like OpenSSL to help make it more secure. This defeats the very reason there are "free" and "open source" projects out there. These projects are about community, not salaries; about innovation and giving to society, not about cash flow.

This means that the FOSS developer community as well as the user community need to shift their psychology to include security at every level of the programs they write, from code to executables. The same amount of care hackers take to write a useful new extension for something like GNU Emacs, for instance, should also be put to the security and quality of the overall program. Security bugs like Heartbleed are not about project money; they're about getting the community to learn about, care about, and do something about the code they agreed to support as FOSS advocates and developers. FOSS gives to society, and that comes with added responsibility.
Marilyn Cohodas, User Rank: Strategist
4/29/2014 | 10:48:41 AM
Re: Drop in the bucket
Thanks for putting that in context, Jon. That's quite a jump from $2,000 a year to $1.2 million. It will be interesting to see how much added security that buys a year from now.
JonNLakeland, User Rank: Strategist
4/29/2014 | 10:06:29 AM
Re: Drop in the bucket
It's not much relative to the profits of those businesses, but compared to the $2,000 annually the project was receiving in donations before, $1.2 million is significant. The budget just went from $167 per month to $100,000 per month.

Honestly I'm impressed that they chose to do as much as they are, given that they don't have responsibility for the software.
Marilyn Cohodas, User Rank: Strategist
4/28/2014 | 3:34:29 PM
Drop in the bucket
A pledge of $100,000 per year from the likes of Facebook, Google, IBM, and Microsoft seems like chump change. Is there something more that the industry should be doing to shore up open source security?
macker490, User Rank: Ninja
4/27/2014 | 8:15:07 AM
problem was not funding
The Heartbleed error is what we classify as a "data dependency." This is sometimes a careless error, but more often an attitude problem where the programmer asserts: "If you send me good data, my program will work fine," i.e., "I shouldn't have to check what you send me because it's your responsibility to send me good data."

I hope there are no questions about the lesson in this case: if you are programming, you have to sanitize your inputs.
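To make that lesson concrete, here is a minimal sketch of the kind of check whose absence defined Heartbleed. It is not OpenSSL's actual heartbeat code, and the function and parameter names are hypothetical; the point is simply that a length the peer claims must be validated against the data the peer actually sent before anything is copied or echoed back.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical echo handler: the peer sends a payload plus a claimed
 * length. Heartbleed-class bugs trust claimed_len; the fix is to bound
 * it by what was really received and by the output buffer's capacity. */
int build_echo_response(const uint8_t *payload, size_t claimed_len,
                        size_t received_len, uint8_t *out, size_t out_cap)
{
    if (claimed_len > received_len || claimed_len > out_cap)
        return -1;                      /* drop the malformed request */

    memcpy(out, payload, claimed_len);  /* copy is now provably in bounds */
    return (int)claimed_len;
}
```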
Robert McDougal, User Rank: Ninja
4/25/2014 | 2:59:16 PM
Re: About time!
You are exactly correct. It has been reported, though not confirmed, that the NSA has upwards of 1,000 employees whose sole responsibility is to examine open source projects for possible vulnerabilities.
Thomas Claburn, User Rank: Moderator
4/25/2014 | 2:52:20 PM
Re: About time!
The funny thing is that the NSA and other intelligence agencies have probably already conducted audits of this sort, at taxpayer expense, on many open source projects. If only they'd share what we've paid for.
Robert McDougal, User Rank: Ninja
4/25/2014 | 2:11:01 PM
About time!
It is a shame that it has taken this long for this pledge to come through. OpenSSL is a critical piece of the security of many organizations and applications, and it should have been audited long ago.

As someone who has dug through the OpenSSL source code, I can tell you that it is a nest of spaghetti code. There could be backdoors intentionally programmed into the code, but without an audit we would never know.

Recently, the Open Crypto Audit Project raised $80,000 to begin the audit process of the TrueCrypt source code. Phase I of that project is complete, and while it did not find any backdoors, it did identify several minor issues that could lead to vulnerabilities.

This is what needs to be done for all open source products that we rely on for security. Although the code is open source and available to all, no single person has the ability or time to review a project in its entirety. Therefore it is important that money and resources are allocated to review the code.
Drew Conry-Murray, User Rank: Ninja
4/25/2014 | 2:09:54 PM
Somebody Else Will Fix It
I'm pleased to see the vendor community step up to fund a project like this. I think the open source community model has demonstrated that it can be robust and effective for producing good software, but Heartbleed also revealed a weakness. In a community model, it's way too easy to assume that somebody else is taking a careful look at the code. If everybody assumes somebody else is doing it, no one is.