Will 2014 be the year of change for the security industry? Not if we continue to approach information security in the same ways we have for the past three decades. It's time to move beyond the hyperbolic claims of next-generation security. To address current threats and to reduce risk, we require empirical data and now-generation technology.
Today, right now, we can significantly reduce risk by using big data threat analytics, and by analyzing security products based on empirical data and practical deployment use cases. In this way, organizations can better understand the limits of their current security infrastructure. Here are some examples of where we are and where we need to go.
Dynamic, not static, risk assessment
The way in which we assess true risk and apply security countermeasures has become predictable and static. The security products organizations deploy do not differ much across industry sectors. Security budgets are cyclical, and strategies are often based on historical information. Risk continues to be measured as a snapshot in time, which significantly increases the time to threat detection and protection. There is no silver bullet that guarantees 100 percent protection, but moving from static to dynamic risk assessment will allow us to begin modeling risk as a variable that can be measured at any given point, in near real-time.
Dynamic risk assessment requires us to examine risk from multiple angles by leveraging big data analytics. With the correct approach and key data points, well-known algorithms can be applied across multiple key indicators to accurately predict and forecast threats against an organization. You might consider this a far-fetched claim, but it’s not.
Variable risk: a new way to bake a cake
A few years ago, an executive from a Fortune 500 organization observed, "We have all the ingredients to make a cake, but we lack the ability to bake it." The comment challenged me to rethink our approach to security and strive for a bold alternative. Doing so meant moving beyond the comfort zone of industry-dictated best practices for reducing risk. We can continue to throw money at point products that close temporary risk gaps exposed by recent breaches. Or we can instead use a variable risk model, in which accurate information across multiple indicators provides the data needed to purchase and deploy the security solutions that most significantly reduce risk.
In this model, we use multiple top-level indicators to establish a variable-risk score. This requires some work, but a security net with gaps is ineffectual. With our new model we can significantly reduce the gaps with accurate information plugged into a new risk equation that offers a pragmatic approach to addressing risk: Attack Surface (Threat Intelligence) + Threat Modeling = Variable Risk.
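The article does not define how the indicators are combined, so the following is a minimal, hypothetical sketch of the variable-risk equation above. The indicator names, threat-intelligence weights, and the equal weighting of the two terms are all illustrative assumptions, not part of the author's model.

```python
# Sketch of: Attack Surface (Threat Intelligence) + Threat Modeling = Variable Risk
# Attack-surface indicators are scored 0-1; threat-intelligence weights scale
# each indicator by how actively it is being targeted in the wild.

def variable_risk(attack_surface, threat_intel_weights, threat_model_score):
    """Return a 0-1 variable-risk score from attack-surface indicators,
    threat-intelligence weights, and a 0-1 threat-modeling score."""
    weighted = sum(attack_surface[k] * threat_intel_weights.get(k, 1.0)
                   for k in attack_surface)
    total_weight = sum(threat_intel_weights.get(k, 1.0) for k in attack_surface)
    surface_score = weighted / total_weight if total_weight else 0.0
    # Equal weighting of the two terms is an arbitrary modeling choice.
    return 0.5 * surface_score + 0.5 * threat_model_score

# Hypothetical example: exposed web apps are being heavily targeted,
# so threat intelligence weights that indicator most strongly.
indicators = {"exposed_web_apps": 0.8, "unpatched_endpoints": 0.4,
              "remote_access": 0.2}
intel = {"exposed_web_apps": 3.0, "unpatched_endpoints": 1.5,
         "remote_access": 1.0}
score = variable_risk(indicators, intel, threat_model_score=0.6)
```

The point of the sketch is that the score changes whenever an indicator or an intelligence weight changes, which is what makes the risk "variable" rather than a snapshot in time.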
Today’s security environment is dynamic and complicated. The threat landscape and the attack surface are constantly changing. Every organization will experience patient zero (the first endpoint compromised when an organization is breached). The ability to reduce the time to detection and prevention is crucial in mitigating a breach.
The variable risk model improves the signal-to-noise ratio by focusing on what really matters in an environment. To remain competitive and secure in today’s global environment, organizations require a tailored approach specific to their attack surfaces. Waiting for next-generation security products to become the status quo only increases an organization’s chances of becoming a statistic, or of discovering too late that it has been one for the past 16 to 18 months.
John Pirc is research vice president at NSS Labs. He is a noted security intelligence and cybercrime expert, an author, and a renowned speaker, with more than 15 years of experience across all areas of security.