5 Steps To Stop A Snowden Scenario
The NSA leaks by a systems administrator have forced enterprises to rethink their risks of an insider leak and their privileged users' access
No organization wants to believe one of its own could go rogue. But after being blindsided by the Edward Snowden leaks, even the highly secretive National Security Agency has been forced to overhaul its procedures to lock down just what its most privileged users can access and do with sensitive information.
As the Snowden case demonstrated, it's not easy to detect an insider threat. Some 54 percent of IT decision-makers say it's harder to catch insider threats today than it was in 2011, and nearly half acknowledge that their organizations are vulnerable to a rogue insider attack, according to a new report (PDF) published today by Vormetric and co-authored by the Enterprise Strategy Group. The findings attribute the difficulty to more users, including contractors, accessing the network, and to a loss of control over data as it moves to the cloud.
Privileged users like Snowden are respondents' biggest concern: 63 percent say their organizations are ripe for abuse by those users, and some 45 percent of IT decision-makers say they've changed their views on insider threats in the wake of reports on Snowden's leaks to the press.
"Up until this [Snowden] case, it was all about providing support, getting customers supported, and getting data to the right people. It was not about analyzing [the admin's] access," says Bob Bigman, former CISO of the CIA. "To provide support, Snowden was given more access than he should have been given ... What exacerbated it was that not only did he have access to his systems there, but systems he had privileges on that were trusted to other systems within NSA. That enabled him to jump [among] various systems ... It was all done under the banner of customer support."
NSA officials told National Public Radio that, as a sys admin, Snowden had access to an NSA file-sharing location on the agency's intranet in order to move sensitive documents to secure places on the network. The NSA didn't catch him copying the files, however, and the agency has since implemented a "two-person rule" for access so that a lone wolf can't leak sensitive information the way Snowden did.
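The "two-person rule" the NSA describes amounts to dual authorization: a sensitive action proceeds only after two distinct privileged users sign off. A minimal sketch of the idea in Python (all class and user names here are hypothetical illustrations, not NSA code):

```python
# Hypothetical sketch of a two-person rule: a sensitive operation
# requires approval from two distinct privileged users before it runs.

class TwoPersonRule:
    def __init__(self):
        self.approvals = {}  # maps action name -> set of approver IDs

    def approve(self, action, approver):
        self.approvals.setdefault(action, set()).add(approver)

    def is_authorized(self, action):
        # Two *distinct* approvers are required; the same person
        # approving twice still counts as a single approval.
        return len(self.approvals.get(action, set())) >= 2

rule = TwoPersonRule()
rule.approve("copy-classified-files", "admin_a")
print(rule.is_authorized("copy-classified-files"))  # False: only one approver
rule.approve("copy-classified-files", "admin_b")
print(rule.is_authorized("copy-classified-files"))  # True: two distinct approvers
```

Using a set of approver IDs is the key design choice: it makes a lone wolf approving an action twice indistinguishable from approving it once.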
"It's human nature to hope for the best. But hope is not a security plan," Bigman says.
Big-name companies are putting new insider threat prevention programs in place. Dell, for example, which was Snowden's employer prior to his gig at NSA contractor Booz Allen, is coincidentally beginning the rollout of an insider threat prevention program that has been in the works for the past two years. Dell calls the initiative its "knowledge assurance program."
John McClurg, Dell's CSO, says "insider risk" is a more appropriate term than "insider threat."
"Not all insiders pose a threat. Many of them carry a vulnerability with them ... that a threat vector might exploit, and some might become the threat vector," McClurg says. He notes it is important to avoid false positives that flag an insider as malicious when his or her credentials have in fact been stolen.
"You do an analysis of what gave way to the false positive," says McClurg, who declined to comment on the Snowden case.
And as with any advanced cyberattack, there's no way to stop a determined rogue insider from stealing or leaking information; it's all about minimizing the damage. "You put in layers that slow them down. Have an active detection capability in place," says Larry Brock, former CISO at DuPont and president of Brock Cyber Consulting, who previously worked for the NSA. "You have time to stop them in their tracks before they do damage."
[A determined user or contractor hell-bent on leaking data can't be stopped, but businesses should revisit their user access policies and protections. See NSA Leak Ushers In New Era Of The Insider Threat.]
Rob Rachwald, senior director of market research at FireEye, who will present at Interop on best practices enterprises are adopting to prevent, detect, and catch insider misbehavior early, says the sys admin problem is really nothing new.
"I remember at one of my first jobs, a sys admin was busted for reading everybody's email down there in the server room in the late '90s," Rachwald says. "It's been going on forever. The big problem is, sys admins are always being 'defined' by big companies like Microsoft and Oracle. They've put some security in [their software], but the fundamental problem is that they are not security companies."
Here are some tips, culled from Rachwald's research and from other security experts, on how to trip up or catch a possible rogue insider in the act:
1. Work closely with the business side to ID critical information to protect -- and loop in the senior execs.
Start small and think big, Rachwald says. "Quite often, security people come in with a little boil-the-ocean approach," he says. Work with the various lines of business to pinpoint where the crown jewels reside and lock them down, he says.
"We found that from an alignment standpoint, good security people have made the problem very personal," Rachwald says of research he conducted. "So they worked with the lines of business to understand the impact of what could go wrong: If this got breached, what would it do to your competitive situation or brand? They're asking lots of those questions to make it personal."
Dell's McClurg notes that the first phase of Dell's program was identifying where its critical data sits, ensuring it's categorized or labeled, for instance. "The first phase of most everyone you talk to is, 'What is the status, the environment?' And call out those opportunities you need to improve, [such as] how you grapple with historical data points," some of which could reside in access control systems, for example, he says.
Brock, meanwhile, says he has seen companies assign a senior, non-IT person as a bridge to work closely with the CEO and security team to review security projects and progress. "Some organizations are reluctant to take this up to the senior leadership in the company. I believe that's crucial. The CEO and [his or her] team really needs to understand these threats," he says.
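The first phase Dell's McClurg describes, finding where critical data sits and labeling it, can be sketched as a simple labeled asset catalog that surfaces the crown jewels to lock down first. The structure, asset names, and sensitivity scale below are illustrative assumptions, not Dell's actual system:

```python
# Hypothetical "crown jewels" inventory: label data assets by sensitivity
# and by the line of business that owns them, so the most critical assets
# can be identified and locked down first.

ASSETS = [
    {"name": "customer-db", "owner": "sales", "sensitivity": 3},
    {"name": "press-kit", "owner": "marketing", "sensitivity": 1},
    {"name": "chip-designs", "owner": "engineering", "sensitivity": 3},
]

def crown_jewels(assets, threshold=3):
    # Return the names of assets at or above the sensitivity threshold.
    return [a["name"] for a in assets if a["sensitivity"] >= threshold]

print(crown_jewels(ASSETS))  # ['customer-db', 'chip-designs']
```

Recording the owning line of business alongside each asset mirrors the advice above: it tells the security team exactly whom to ask about the impact if that asset were breached.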
2. Team with your legal and human resources departments.
Make it as difficult as possible for an insider to go rogue by tying user policies in with the legal department and HR, Rachwald says.
One company in his study created processes that would trigger the legal department to step in. "If the [employee] were off-boarded [from the company's systems], they'd give a list of things he had access to, apps. If any of this came up with the competition, it would be under scrutiny," Rachwald says.
Have HR inform employees of the consequences of a competitor getting stolen information, for instance. "A lot of companies are working closely with HR not to just implement policies around insider threat, but also training" on the reason behind it, he says.
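The off-boarding trigger Rachwald describes can be sketched as a small handoff routine: when an employee leaves, compile the systems and applications they could access and route that list to legal for potential scrutiny. The data structure and names below are hypothetical illustrations, not the studied company's process:

```python
# Hypothetical off-boarding sketch: when an employee leaves, revoke their
# entries from the access record and hand legal a list of what they could
# reach, so any related material surfacing at a competitor can be scrutinized.

ACCESS_LOG = {
    "jdoe": ["crm", "source-repo", "hr-portal"],
    "asmith": ["crm"],
}

def offboard(user):
    # Removing the user from ACCESS_LOG doubles as revocation in this sketch.
    access = sorted(ACCESS_LOG.pop(user, []))
    return {"user": user, "had_access_to": access, "notify": "legal"}

report = offboard("jdoe")
print(report["had_access_to"])  # ['crm', 'hr-portal', 'source-repo']
```

Popping the user's entry means a second off-boarding of the same account yields an empty access list, so the report can only be generated once per departure.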
3. Decentralize your security department model.
Some large enterprises have embedded security staffers within the various lines of business so they forge closer ties with them and better understand their data security needs, FireEye's Rachwald says.
"They could understand the line of business and work very carefully with the owners on what the important data is, what the important processes are, who the data owners are, and put processes in place," he says. "There's a big benefit when [security] people understand that business extremely well."
The catch, of course, is such a model isn't realistic for resource-strapped smaller companies, which are stuck with a more centralized approach.
Next page: Schooling and revoking privileges