Like many of you, I was there at the birth of the cloud. I watched its evolution in the private sector and have supported adoption in the public sector, where it is taking much longer because of the time it takes government to innovate and implement new technology. Few government CIOs have been able to overcome this challenge.
Contrast that with the experience in the private sector, where cloud computing has delivered cost savings and increased efficiency across many industries. Early adopters’ ability to change and adapt quickly to the cloud fueled their success, but that same process has stymied many large government IT organizations.
Opportunities and Blind Spots
For government, there is a great opportunity in having a common architecture that propels intelligence integration and big data analytics. It starts with the flow of intelligence from collectors, people, satellites and sensors into the cloud(s). Authorized personnel with the right credentials can check out applications from a library to interrogate, analyze and enhance the data. Products can then be developed, hosted and consumed, with usage tracked for value. The integration that happens in the cloud is a tremendous value. But as this shift takes place, governments often find themselves in the middle of a balancing act, trying to manage both the cloud and the legacy systems that remain.
Years ago, when my public sector organization was preparing for an external assessment, I wanted to ensure that two large enterprises received a minimum rating of excellent based on the criteria established for the review. My director also wanted to know the outcome in advance of the inspection. I’ll call the two enterprises Enterprise A and Enterprise B. As I pushed for compliance on Enterprise A, its metrics improved, but at the same time the metrics for Enterprise B dropped. I pushed for focus on Enterprise B, and voila, Enterprise B improved; however, Enterprise A’s metrics worsened. The reason? The same manpower was responsible for patching both enterprises and could only keep up with one at a time. At the time, this was a very manual process. Lesson learned: managing risk and continuously evaluating the security posture of legacy systems and the cloud often draw on the same resources.
Vulnerabilities to the Persistent Threat
As more government organizations migrate to the cloud, the balancing act of protecting, monitoring and testing thousands of legacy systems will intensify, along with advanced persistent threats from bad actors: nation-states, cyber criminals, and hacktivists. The threat is advanced not because of technology but because of the way hackers perform reconnaissance, collect intelligence, and persistently go after very specific targets. Any organization touching the internet, including those whose demilitarized zones (DMZs) have not yet been compromised, will likely be compromised eventually. Only the naïve believe they are impenetrable.
The Problem with Signature-based Technologies
Legacy security technologies such as antivirus, firewalls, and intrusion detection systems (IDS) face far more advanced threats now than when they were originally authorized to operate in the 2008 to 2010 time period. In 2008, these technologies were solid solutions for defense-in-depth strategies. Today, however, the benefits no longer justify the cost, and public sector organizations will need to make trade-offs for a more modern platform that uses behavioral and heuristic signatures. Solutions for the threats faced today must also incorporate intelligence.
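To make the distinction concrete, a signature engine flags only payloads that match a known pattern, while a behavioral or heuristic engine scores what a process actually does. The sketch below is a minimal illustration; the signatures, behavior names, weights and threshold are all invented for this example, not drawn from any real product:

```python
# Minimal sketch contrasting signature-based and behavioral detection.
# All signatures, behavior names, and weights here are hypothetical.

KNOWN_SIGNATURES = {b"DROP TABLE users", b"\x4d\x5a\x90\x00evil"}

def signature_scan(payload: bytes) -> bool:
    """Flags only payloads containing an exact, already-known pattern."""
    return any(sig in payload for sig in KNOWN_SIGNATURES)

# Weighted suspicious behaviors a heuristic engine might observe.
SUSPICIOUS_BEHAVIORS = {
    "spawns_shell": 0.4,
    "reads_credential_store": 0.5,
    "beacons_on_interval": 0.3,
}

def behavioral_score(observed: set[str]) -> float:
    """Scores a process by what it does, not what it looks like."""
    return sum(SUSPICIOUS_BEHAVIORS.get(b, 0.0) for b in observed)

def is_suspicious(payload: bytes, observed: set[str],
                  threshold: float = 0.6) -> bool:
    # A novel (zero-day) payload evades the signature scan entirely,
    # but its runtime behavior can still push it over the threshold.
    return signature_scan(payload) or behavioral_score(observed) >= threshold
```

A repacked zero-day carries no known signature, so `signature_scan` misses it; but if it reads the credential store and beacons out on an interval, the behavioral score alone flags it. That gap is why signature-only stacks age so poorly against persistent threat actors.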
Budget and Operational Challenges
It’s no secret that government does not move quickly to implement new technology. The budget lifecycle is a long, arduous process, typically 18 to 24 months. But in the real world, threats advance much faster, with little regard for the interval between a budget proposal, its justification and its implementation. As a result, government security executives are put in the position of having to obtain approval for silos of capabilities that individually look great at a given moment in time but are hard and expensive to integrate and execute.
What’s more, especially for Security Operations Centers (SOCs), the public sector needs a faster refresh rate to keep up with persistent threats, because standard SOC technologies (IDS, AV, and firewalls) do not protect against new threats like zero-days or persistent threat actors. When you combine aging, signature-based technologies with the unfortunate possibility that legacy system protection could be overlooked, the outcome could be devastating. Here are seven critical strategies and practices to avoid a worst-case scenario.
- Follow National Institute of Standards and Technology (NIST) guidance for continuously monitoring the security controls of the systems in operation.
- Executive leadership must recognize that legacy systems and government clouds require robust and evolving protection. Measuring the security of both is critical.
- Deploy hosted security services in both the enterprise and the cloud to make it easier for legacy systems to inherit those services and to ensure that continuous monitoring is performed centrally.
- Within the SOC, develop a plan to protect legacy systems with a platform that can integrate signature-less (heuristic, behavioral) and signature tools.
- Allocate your resources wisely. Make sure you have both the manpower and the technology to cover both existing systems and the cloud.
- Analyze the effectiveness of your current SOC capabilities against the cost. Make sure you are getting the value you need and make trade-offs when necessary.
- Act with a sense of urgency and purpose. An 18- to 24-month budget cycle is too long to deliver adequate security capabilities against today’s threats.
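The continuous-monitoring and resource-allocation points above can be made concrete: record a security-posture metric, such as patch coverage, for every enterprise on every evaluation cycle, and flag any enterprise that slips below target, rather than discovering the imbalance at inspection time as in the Enterprise A and Enterprise B story. The sketch below is a minimal illustration; the 95% target and the coverage figures are assumptions chosen to mirror that story, not real data:

```python
# Minimal continuous-monitoring sketch: track patch coverage per enterprise
# each cycle and flag any enterprise below target.
# The 95% target and all coverage figures are hypothetical.

TARGET = 0.95

def evaluate_posture(coverage: dict[str, float]) -> list[str]:
    """Returns the (sorted) enterprises whose patch coverage is below target."""
    return sorted(name for name, pct in coverage.items() if pct < TARGET)

# Cycle 1: effort focused on Enterprise A, so Enterprise B slips.
cycle1 = {"Enterprise A": 0.98, "Enterprise B": 0.88}
# Cycle 2: effort shifts to Enterprise B, and Enterprise A slips.
cycle2 = {"Enterprise A": 0.90, "Enterprise B": 0.97}

for label, cycle in (("cycle 1", cycle1), ("cycle 2", cycle2)):
    lagging = evaluate_posture(cycle)
    if lagging:
        print(f"{label}: below target -> {', '.join(lagging)}")
```

Run continuously instead of annually, a check like this surfaces the see-saw between the two enterprises on the very cycle it happens, which is when there is still time to rebalance manpower.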
The cloud offers tremendous value for government, but only if organizations adopt measures that protect legacy systems and ensure that security solutions can defend against today’s advanced threats.