On March 2, the White House officially announced its National Cybersecurity Strategy (NCS). The policy document addresses a wide range of issues centered on five pillars. From protecting critical infrastructure to fighting ransomware gangs, it sets admirable goals for the future of cybersecurity in the United States.
The NCS focuses on outcomes: stronger international collaboration, greater responsibility in the private sector, and increased resilience. However, the document is light on processes and specifics. For these policies to succeed in practice, the government must shift its focus and reconsider how it thinks about securing our nation's digital and physical assets: less emphasis on outcomes, more on processes. Comprehensive cybersecurity will require organizations to secure assets at the application and data level, rather than focusing only on cloud and on-premises infrastructure. Here's why.
Putting the Onus on Corporations
The most notable aspect of the NCS is its focus on a new corporate burden. In the fact sheet released together with the complete strategy, the White House argues, "We must rebalance the responsibility to defend cyberspace by shifting the burden for cybersecurity away from individuals, small businesses, and local governments, and onto the organizations that are most capable and best positioned to reduce risks for all of us." This principle points directly to our nation's technology companies — the organizations driving innovation and digitalization.
Again, this is a sensible and laudable strategy. However, the White House's emphasis on corporate responsibility obscures the fact that many organizations don't know how to secure their most important assets; some don't even know where to begin. Recent market research on data security finds the industry struggling even to meet basic requirements. We have a long way to go, and these companies will need leadership and guidance from the federal government to reach our collective goal.
Expanding Surfaces and Tangled Interdependencies
A decade ago, organizations faced a bounded challenge when securing their assets. Their most valuable resources were under their physical control, so they could establish a perimeter and limit access to authorized users.
The rise of cloud computing — and the explosion of applications and microservices that came with it — completely upended this approach to security. Businesses must now use cloud-based software in order to remain agile and competitive. However, cloud-based applications and application programming interfaces (APIs) have dramatically expanded the potential attack surfaces for bad actors. As businesses add new APIs and applications, their assets and architectures begin to drift, and existing security approaches fail to keep pace with evolving vulnerabilities. In 2021, an unauthenticated API allowed a bad actor to download personal data from thousands of Peloton customers.
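The Peloton incident above came down to an endpoint that returned user data without verifying the caller. A minimal sketch (with hypothetical function and token names, not Peloton's actual API) shows the difference between an endpoint that trusts a caller-supplied ID and one that checks the caller's session first:

```python
# Minimal sketch of an unauthenticated vs. authenticated data endpoint.
# All names here are illustrative; this is not any vendor's real API.

USER_PROFILES = {
    101: {"name": "Alice", "workout_history": ["run", "cycle"]},
    102: {"name": "Bob", "workout_history": ["row"]},
}

# Session store the server would normally consult (assumed for illustration).
SESSION_TOKENS = {"token-alice": 101}

def get_profile_unauthenticated(user_id):
    """Flawed: trusts the caller-supplied ID, so anyone can enumerate IDs
    and harvest every profile in the system."""
    return USER_PROFILES.get(user_id)

def get_profile_authenticated(token, user_id):
    """Fixed: rejects any request whose session token does not own the
    requested ID (a real API would return 401/403 here)."""
    if SESSION_TOKENS.get(token) != user_id:
        return None
    return USER_PROFILES.get(user_id)
```

The flaw is not exotic: the vulnerable handler and the fixed one differ by a single ownership check, which is exactly why these gaps are easy to introduce and easy to miss as APIs multiply.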
When the White House emphasizes that organizations must be responsible for security, it overlooks the current reality that our organizations are interconnected by a tangled web of applications and microservices. A major enterprise can take every necessary step to protect its own infrastructure, but a weak link among its third-party microservice partners can still lead to the same breach.
Understanding How We Reach Security Outcomes
Just as businesses have changed the way they secure their assets, they've also changed the way they build those assets. Ten years ago, technology companies would release major new products or updates with fanfare on a predetermined schedule. Today, the rise of continuous integration and continuous delivery (CI/CD) means new features are released constantly or even tested in production. As a result, application code and configurations change 10 times more frequently than cloud infrastructure and the network itself.
The increased tempo of code changes, combined with our growing web of interdependencies, means organizations face an impossible task when trying to secure their applications and infrastructure. According to recent Gartner research, third-party APIs will make up an average of 30% of the APIs used in applications by 2025. IDC estimates that 40% of companies will be shipping code to production daily by 2024. Existing security tests focus on specific assets or changes to code, making it easy to miss the dependencies and data flows that could become vulnerable in these hypercomplex architectures.
An effective cybersecurity approach must focus on the applications themselves and the data traveling between them. Instead of zooming in on an organization's infrastructure and adopting a myopic, perimeter-based approach to securing it, we need to zoom out to look at the entire web of applications, APIs, and microservices. Organizations must be able to track how changes to code reverberate throughout their system and detect vulnerabilities that emerge as a result.
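The kind of change tracking described above can be sketched as a reachability question over a service dependency graph: given a change to one service, which downstream services could be affected? The sketch below uses hypothetical service names and a hand-built graph; in practice this map would be derived from API call logs or deployment manifests.

```python
# Illustrative sketch: tracing how a change to one service reverberates
# through a web of interdependent services. Edges point from a service to
# the services that call it (its dependents). Names are hypothetical.
from collections import deque

DEPENDENTS = {
    "auth-api":       ["checkout-app", "mobile-gateway"],
    "checkout-app":   ["analytics"],
    "mobile-gateway": ["analytics"],
    "billing":        ["analytics"],
    "analytics":      [],
}

def impacted_services(changed):
    """Breadth-first walk collecting every service reachable from the
    changed one, i.e., everything that should be re-tested."""
    seen, queue = set(), deque([changed])
    while queue:
        service = queue.popleft()
        for dependent in DEPENDENTS.get(service, []):
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen
```

A change to "auth-api" in this toy graph flags not just its direct callers but also "analytics" two hops away, which is precisely the class of transitive exposure that per-asset security tests tend to miss.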
A Cybersecurity Strategy Grounded in Reality and Collaboration
The fifth and final pillar of the NCS emphasizes the need for collaboration among international allies, recognizing that we depend on each other for the overall safety of our digital world. As we work to implement the principles of the NCS, we must recognize that these dependencies also exist in the private sector. The White House must be realistic about how our technology systems work and how technology companies are now forever intertwined, for better or for worse.
To truly protect the United States' cyber assets and establish a safe environment for digital innovation, the White House must put application security in production at the heart of its cybersecurity strategy. If we spend our resources addressing the security challenges of 10 years ago, we will fail to address the true threats we face in 2023 and fall hopelessly behind in our ability to counter those that may emerge over the next 10 years.