Federal Agencies Wrestle With Cybersecurity's Harsh Realities
Sophistication of attacks, shortage of resources lead agency IT chiefs to focus less on perfect security -- and more on risk management
WASHINGTON, D.C. -- FedScoop Cybersecurity Leadership Summit -- In a perfect world, U.S. federal agencies would be able to prevent all attacks -- and identify those who launch them. In a perfect world, agencies would comply with all security regulations and provide open access to public information while tightly securing all data that might be important to national security.
There's just one problem: The world isn't perfect.
That was the message here today as top IT executives of several federal agencies -- as well as federal business unit executives of some of the industry's largest vendors -- met in a panel discussion on government cybersecurity programs and the challenges they face. In essence, all of the executives described their efforts to deal with security's realities in a practical fashion, rather than attempting to build impenetrable perimeters.
"The fact is that there is no point of attack that is too obscure to be found, and there is no security perimeter that can't be breached," said Tom Quillin, director of security alliances at Intel. "We need tools and strategies to protect [systems] from attack -- but we also need ways to detect attacks and recover from those attacks when they penetrate those defenses."
Ron Ross, computer scientist and a leading voice on security at the National Institute of Standards and Technology (NIST), agreed. "Even in the case of nuclear attacks, we have 20 or 30 minutes to respond, but cyberattacks move at the speed of light -- there's no time to respond. A defense that's built entirely around boundary protection won't work. Like an army, we have to be able to absorb an attack -- and still be able to move on to complete the mission."
Even what constitutes an act of cyberwar -- and how to identify those who launch it -- remain open questions for the federal government, said Lt. General Jeffrey Sorenson, CIO of the U.S. Army. "How do we know who's conducting an attack? Are we really under attack, or is it just a nuisance? To be honest, the metrics we use to measure these questions are still very fluid."
Even simple compliance with federal government IT policies is problematic, according to Sanjeev (Sonny) Bhagowalia, CIO of the Department of the Interior. "One thing I have learned in government -- we have a lot of regulations," he quipped. "We don't have enough budget to comply with everything. We're also dealing with a major shift in policy -- after the 9/11 attacks, we were in the mode of, 'Secure everything and share what you must.' Now with the open government initiative, it's, 'Provide access to everything and protect what you must.' It's a big change for us, especially in security."
Then there's the problem of identifying and locating sensitive data, noted John Bordwine, CTO of public sector at Symantec. "We have reached the problem of what I call '30 instances of data,'" he said. "If a few of us share an email, then that email appears on our hard drives, in our sent mail folders, on our backup devices, and on the company's backup system. It appears on our smartphones and PDAs. In some cases, the data might end up with a contractor or in a cloud infrastructure. Finding that data and securing it is not easy."
With so many demands on security -- and only limited resources to meet them -- federal agencies are rapidly reaching the conclusion that the holy grail of the secure perimeter must be put aside in favor of a more practical, risk-oriented set of security priorities, the speakers said. The key: identify the highest risks -- both in terms of data sensitivity and likelihood of attack -- and secure that data first.
"We're looking at risk-based approaches, rather than security perfection," Bhagowalia said. "It's more about information assurance than about security. It's more about continuous monitoring than about compliance."
NIST's Ross agreed. "We've developed a structure for enterprise-wide risk management," he said. "How do you monitor risk over time? How much risk can you tolerate? Once you've answered these questions, then you can set up your missions and business processes."
While agencies are focusing internally on security realities rather than absolutes, that same mentality could be applied to the question of identifying the source of an external cyberattack, noted Lewis Shepherd, director of Microsoft's Institute for Advanced Technology in Government.
"Attribution may not always be possible, but there is the possibility of developing a 'sliding scale' for situations where that's the case," Shepherd said. "We do this in our courts all the time -- we make decisions based on 'beyond a reasonable doubt' or 'a preponderance of evidence.' We measure the probability of culpability and the harm or potential harm that was caused, and we render a decision."
But whether the problem is compliance or all-out cyberwar, the security challenges for federal agencies are not likely to get smaller anytime soon, Ross said. "The way we're handling them now, we can pretty much guarantee that the attacks will continue," he said. "We're seeing exponential growth in the incidence of malware. The threat space is increasing a lot. This problem is not going away."