Consolidation and automation are among the strategies for balancing security complexity and capability.

Martin Roesch, CEO, Netography

February 1, 2022


Years ago, I decided to upgrade my home surround-sound system to the (then) state of the art. It was a game changer, with high-fidelity wireless speakers that could be partitioned to play different music in separate parts of the house. The system worked flawlessly for the first couple of years, but, as time went by and new features were rolled out, the system's performance began to slowly degrade. Playlists wouldn't load, the music would cut out, older speakers wouldn't work with the new app … I still love my sound system, but I wish it worked as well as it did when I first got it. 

Anyone who has spent time in the enterprise information security trenches can no doubt relate to this story: We want the latest and greatest capabilities, but, at the end of the day, we just want our systems to work. As the renowned security guru Bruce Schneier wrote, "Complexity is the worst enemy of security — and our systems are getting more complex all the time."

In the domain of security, we have become accustomed, if not resigned, to a perpetual upgrade cycle. Each new capability often comes at the cost of added complexity, sometimes to the point of rendering the advancement itself moot. The scope of the issue is enormous when you consider that there are currently more than 3,500 security vendors in the industry and that the average enterprise manages 45 different security tools.

Why Incremental Capabilities Beget Mountains of Complexity
Security has grown so complex, in large part, by design. For the better part of the past two decades, the prevailing wisdom has been to adopt a layered, or "defense in depth," approach to security. The logic is simple: Relying on a single tool to perform a given function creates a single point of failure; layering multiple tools provides a backstop in the event that one tool misses a threat or fails in some way.

It's no wonder, then, that security leaders have been beefing up their tool arsenals. This may make them feel like they have protected their organizations, but more products and more capabilities do not necessarily equal better security.

The reality is that no major data breach has ever come as the result of not having enough tools in place. In fact, many of the most devastating data breaches during the past decade were caused, at least in part, by complexity: The cacophony of noise generated from dozens of separate systems offers the perfect cover for threat actors, enabling them to remain undetected inside networks for months (and sometimes even years).

3 Strategies to Close the Security Effectiveness Gap
To balance security and complexity, organizations must close the security effectiveness gap — the widening delta between new capabilities introduced and the complexity these capabilities produce.

Here are three strategies for closing this gap and building a more responsive security practice — one that can filter out important threat signals amid a sea of noise.

1. Don't invest in a new tool unless it's truly interoperable: There has been a lot of positive momentum over the past couple of years toward making security systems smarter and more interoperable. But just because a vendor says its tool is easy to integrate doesn't mean it is. Before introducing a new tool, determine which other systems it needs to interoperate with, and seek out tools that generate prescriptive insights rather than batches of data that must be processed and analyzed.

2. Take stock of your existing capabilities and consolidate ruthlessly: Another way to keep complexity in check is to focus on fewer devices with more capabilities and to invest in platforms that can talk to complementary pieces of hardware and software. The more hardware and software components a network contains, the more brittle and difficult to interpret its interdependencies become.

3. Implement automation to attack the analytics complexity problem: Every security team struggles with the base rate fallacy and false positives. The base rate fallacy leads us to treat every alert as equally likely to represent a real attack, when real attacks are in fact vanishingly rare among the events a network generates. Machines can help mitigate this complexity by supplying the context needed to determine how much we should care about a particular event, and by doing so in an automated fashion, as the sketch below illustrates.
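To see why the base rate matters, consider a back-of-the-envelope calculation. The numbers in this sketch are hypothetical, chosen only to illustrate the arithmetic: even a detector with a 99% detection rate and a 1% false-positive rate produces alerts that are overwhelmingly noise when real attacks are rare.

```python
# Hypothetical numbers for illustration only.
events_per_day = 1_000_000          # total events the network generates
real_attacks = 100                  # actual malicious events (the base rate)
benign_events = events_per_day - real_attacks

detection_rate = 0.99               # fraction of real attacks flagged
false_positive_rate = 0.01          # fraction of benign events flagged

true_alerts = real_attacks * detection_rate          # ~99 alerts
false_alerts = benign_events * false_positive_rate   # ~10,000 alerts

precision = true_alerts / (true_alerts + false_alerts)
print(f"Total alerts per day: {true_alerts + false_alerts:,.0f}")
print(f"Probability a given alert is a real attack: {precision:.1%}")
# Roughly 1%: without added context, 99 of every 100 alerts are noise.
```

Context that raises or lowers the prior for a given event, such as asset criticality or known-bad infrastructure, is what shifts that precision figure, and it is exactly the kind of enrichment that lends itself to automation.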

Conclusion
As an industry, we need to fundamentally shift the way we think about security and the role of tools in general. We must also remember that no matter how robust tools may be, it takes people to effectively use and make sense of them.

About the Author

Martin Roesch

CEO, Netography

With over 25 years of experience in information security and embedded systems engineering, Martin Roesch was one of the first entrepreneurs to successfully commercialize open source software, in addition to creating the global standard for describing and detecting network-based attacks. 

In 2001, he founded Sourcefire, serving as CEO/CTO until its 2013 acquisition by Cisco for $2.7 billion. He then joined Cisco, where he led the Security Business Group as Chief Architect. Martin is the original author and lead developer of the Snort Intrusion Detection and Prevention System, which formed the foundation of the Sourcefire product suite.

Today, Martin is the CEO of Netography, the security company for the atomized network, which recently secured $45 million in Series A funding.

