In a recent New York Times opinion piece, National Security Agency General Counsel Glenn Gerstell described how traditional national security systems, developed after World War II, dependably gave early warning of foreign military developments, such as missile launches and the movement of tanks, aircraft, ships, and submarines. Fusing telemetry data with advanced surveillance technology gave us confidence that we were safe and could manage contingencies. However, Gerstell makes a compelling argument that this is no longer the case: the technology revolution, he writes, has "upended" our national security infrastructure and institutions.
Gerstell is not alone in his thinking. Joseph Maguire, the acting Director of National Intelligence, also believes cyberspace is our biggest vulnerability. Outside of government and the military, a recent survey of American businesses of all sizes, conducted by The Travelers Companies, found that cybersecurity was respondents' No. 1 concern.
As an enterprise leader, it is worth recalling why our post-World War II strategy was successful: We integrated what we knew about foreign military developments in real time. Unfortunately, today we are too focused on finding a better mousetrap and not integrating what we know.
Time to Stop Playing Security Whack-a-Mole
I recently spoke with a CISO who had won approval to procure 15 tools to bolster security operations, yet I heard little about fusing their output datasets to create a real-time understanding of suspicious activity across the enterprise.
The CISO's focus was on hiring more analysts, who are hard to find and burn out quickly from a daily whack-a-mole game of responding to redundant incidents without correlating them with what they've seen in the past. Even companies that can afford one of everything acknowledge this strategy generates too much noise. The combination of too many tools, redundant threat feeds, and analyst burnout leads companies to spend more and become less secure. This strategy becomes even more inefficient and costly at scale, when whole sectors and industries choose to "tool up" rather than take a disciplined approach to managing and fusing cyber intelligence. We must reset our strategy for securing ourselves rather than search for a better mousetrap (or buy more of them). We must fuse the tools that we already have.
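The correlation step analysts are missing can be sketched in a few lines. This is a minimal, hypothetical example (the alert shape and field names are illustrative, not any vendor's schema): group alerts that share an indicator so redundant incidents surface as one correlated case instead of many.

```python
from collections import defaultdict

def correlate_alerts(alerts):
    """Group alerts that share an indicator (IP, hash, domain) so
    redundant incidents can be fused into a single case.
    `alerts` is a list of dicts with 'id' and 'indicators' keys --
    a simplified stand-in for SIEM/EDR output."""
    by_indicator = defaultdict(list)
    for alert in alerts:
        for indicator in alert["indicators"]:
            by_indicator[indicator].append(alert["id"])
    # Keep only indicators seen in more than one alert: these are
    # the overlaps worth surfacing to an analyst.
    return {ind: ids for ind, ids in by_indicator.items() if len(ids) > 1}

alerts = [
    {"id": "A1", "indicators": ["198.51.100.7"]},
    {"id": "A2", "indicators": ["198.51.100.7", "evil.example"]},
    {"id": "A3", "indicators": ["evil.example"]},
]
print(correlate_alerts(alerts))
# {'198.51.100.7': ['A1', 'A2'], 'evil.example': ['A2', 'A3']}
```

Even this toy grouping shows why fusion beats more tools: three separate tickets collapse into two linked clusters an analyst can triage once.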
How to Leverage What You Have
Start by taking a page from how security teams handle traditional security threats to weave together a system of ecosystems in the cloud. There are typically three stages.
Stage 1. Companies leveraging the cloud fuse alerts from their own systems with their external intelligence providers. This requires companies to easily integrate the output from their existing tech stack (SIEM, EDR, case management, orchestration) with input from external intelligence sources without disrupting analyst workflow.
Stage 2. Layer in security-related activity from beyond security operations, such as fraud and abuse. Each of these leads to security problems within the enterprise and for companies downrange. For example, account takeovers (ATOs) can not only be used for malicious activity inside a company but can also lead to adversaries misusing an account to attack others.
Stage 3. Reach out to other companies to exchange information about your common security and fraud challenges. This is where the cloud holds significant advantages, as companies choose partners based on a variety of needs, ranging from securing supply chains to battling specific threats within and between sectors. The cloud allows the public and private sectors to work with each other. Rather than just sharing information, companies can define use cases and have the means to quickly and seamlessly exchange and analyze data. The cloud also enables companies to derive insights and trends within their own company as well as see how they compare with others.
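The Stage 1 fusion step can be sketched concretely. This is a hedged, minimal example under assumed data shapes (`events` and `intel_feed` as lists of dicts with an `indicator` field; all names are hypothetical): tag each internal event that matches an entry in an external feed with that entry's context.

```python
def enrich_with_intel(events, intel_feed):
    """Fuse internal events with an external intel feed: mark each
    event whose indicator appears in the feed and attach the feed's
    context. Field names are illustrative, not a real tool's API."""
    # Index the feed once so each event lookup is O(1).
    intel_index = {entry["indicator"]: entry for entry in intel_feed}
    enriched = []
    for event in events:
        hit = intel_index.get(event["indicator"])
        enriched.append({
            **event,
            "intel_match": hit is not None,
            "intel_context": hit["context"] if hit else None,
        })
    return enriched

events = [{"indicator": "198.51.100.7", "host": "web01"},
          {"indicator": "203.0.113.9", "host": "mail02"}]
intel_feed = [{"indicator": "198.51.100.7", "context": "known botnet C2"}]
print(enrich_with_intel(events, intel_feed))
```

The key design point, in line with Stage 1, is that enrichment happens on the company's existing event stream: the analyst's workflow is unchanged, but each event now carries external context.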
A New Model: LA CyberLab
Hundreds of companies are already changing course to a cloud-based model to fuse their internal data with external threat information. They ingest and enrich cyber intel from a variety of tools ranging from security event management systems to endpoint detection and case management systems to third-party intelligence. A successful platform combines several capabilities: ingesting and normalizing structured and unstructured data, permissions and access management, fusing and enriching data, and redacting sensitive and proprietary information. A platform must also be extensible so that companies can fuse data between separate security-related operations such as security operations centers, fraud, and internal investigations within companies and between companies.
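Two of the platform capabilities above, normalizing tool-specific data into a shared schema and redacting proprietary information before exchange, can be illustrated with a short sketch. The field mappings and the `corp.example.com` internal domain are assumptions for the example, not any platform's actual schema.

```python
import re

# Hypothetical pattern for internal hostnames that must not leave
# the company when data is exchanged with partners.
INTERNAL_HOST = re.compile(r"\b[\w-]+\.corp\.example\.com\b")

def normalize(raw, source):
    """Map a tool-specific alert dict onto a shared schema so data
    from different tools can be fused. Mappings are illustrative."""
    mapping = {
        "siem": {"ts": "timestamp", "ioc": "indicator"},
        "edr": {"time": "timestamp", "hash": "indicator"},
    }[source]
    return {common: raw[native] for native, common in mapping.items()}

def redact(text):
    """Mask internal hostnames before the record is shared."""
    return INTERNAL_HOST.sub("[REDACTED-HOST]", text)

alert = normalize({"ts": "2019-09-01T10:00Z", "ioc": "evil.example"}, "siem")
note = redact("beacon from db01.corp.example.com to evil.example")
print(alert, note)
```

Normalization is what lets the correlation and enrichment stages treat SIEM, EDR, and third-party data uniformly; redaction is what makes companies comfortable exchanging the result.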
In September, Los Angeles Mayor Eric Garcetti launched the LA CyberLab, a TruStar customer, to fuse data from the public and private sectors, local municipalities, and consumers. The exchange of suspicious event data will speed investigations, identify trends, and ultimately improve security. It has backing from the mayor, the Department of Homeland Security, IBM, and innovative technology platforms, as well as some of Los Angeles' biggest business leaders.
LA's model can be replicated, creating new ecosystems of fused data involving suspicious events. Leaders recognized that threat actors commodify and replicate attacks across sectors and local, state, and federal government. Sector-based sharing models like ISACs and ISAOs will remain important, but LA's model is different. The potential power of fusion is immense when we start to think about security in terms of interconnected systems instead of siloing data between tools and sectors. We must converge our cyber intelligence systems in order to achieve full visibility of the attack landscape. We should look to LA as a model of where we must go.
Check out The Edge, Dark Reading's new section for features, threat data, and in-depth perspectives. Today's top story: "In the Market for a MSSP? Ask These Questions First"