There's a strange paradox about business today. Technology, which has long been its most powerful enabler and accelerant, has emerged as business's biggest, but largely invisible, threat.
I'm not talking about the latest apocalyptic fantasy about artificial intelligence, but rather the exploding by-product of business in the age of cloud computing and the Internet of Things (IoT): data. As IBM CEO Ginni Rometty recently declared, "Data is the world's new natural resource. It's the new basis of competitive advantage and it's transforming every profession and industry." Yet if all that is true, she argued, "then cybercrime, by definition, is the greatest threat to every profession, every industry, every company in the world."
It's a rational argument. Global cybercrime is predicted to cost $6 trillion annually by 2021, and it may be every bit as existentially scary as Rometty suggests. Because almost every function of business has been digitized, today's cloud-powered companies operate at incredible speed, and they will only keep accelerating. What's more, billions of new IoT-enabled devices are baked into just about every facet of industrial technology, from power grids and wind turbines to break-room snack machines, all slinging data around the clock. The result is an unprecedented level of security risk, driven by a rapidly expanding attack surface that now faces virtually every company. No wonder it takes most companies more than six months just to detect a data breach. And, as we saw with the latest Uber breach, businesses may take months or even a year to disclose a breach to the public after it is detected.
What companies lack today is accurate, real-time visibility into their dynamic attack surface. Traditional security tools were built for long-gone fixtures such as client-server technology, on-premises data centers, and linear software development cycles; modern IT thinks in release cycles measured in minutes. (In just two years, according to a recent study by Cisco, the number of third-party cloud applications used in business has grown by a factor of 10, and more than a quarter of them were deemed high risk.)
Additionally, a worst-case mindset tends to cloud more pragmatic executive decision making. Companies often fixate on macro events like nation-state attacks when they are far more likely to be hit by commodity malware like WannaCry. Too often, they neglect the simple measures that would protect them against these more likely threats.
How can executives shift to smarter, more holistic management of cyber risk? It starts with focusing on the widening gap between the threats and risks that are currently visible, which understate the real picture, and the company's true cyber exposure. Scanning the network for vulnerabilities or deploying multiple tools against the "threat of the week" is a one-size-fits-all approach that no longer aligns with reality. Mobile and IoT devices often operate under the radar of such security tools, as do public cloud resources, software-as-a-service applications, and industrial control systems.
In order for businesses to effectively manage their cyber exposure, here's what I recommend:
Every aspect of business carries risks that can be managed, and managed well; cyber exposure is no different. Emerging technologies that focus on a targeted piece of the attack surface (operational technology, say, or open-source software), combined with advanced security analytics and cross-functional operational workflows, can help companies reduce their exposure and give business leaders greater confidence in managing risk on the basis of quantitative, actionable measurements. A scanner can identify a vulnerability; a true understanding of cyber exposure means analyzing how serious that risk is, what might happen if you choose to accept it, and how severe the possible outcomes of a breach could be.
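To make "quantitative and actionable" concrete, here is a minimal sketch of one common way to prioritize exposure: ranking assets by expected annual loss, the likelihood of a breach multiplied by its cost. This is an illustration only, not any vendor's actual model; the asset names, probabilities, and dollar figures below are all hypothetical.

```python
# Illustrative sketch: a simple quantitative view of cyber exposure.
# All assets, probabilities, and impact figures are hypothetical.

def expected_loss(breach_probability: float, impact_usd: float) -> float:
    """Expected annual loss: likelihood of a breach times its cost."""
    return breach_probability * impact_usd

# Hypothetical inventory: (asset, annual breach probability, impact in USD).
assets = [
    ("public cloud storage", 0.30, 2_000_000),
    ("break-room IoT vending machine", 0.60, 5_000),
    ("on-premises HR database", 0.05, 8_000_000),
]

# Prioritize remediation by expected loss, highest first.
ranked = sorted(assets, key=lambda a: expected_loss(a[1], a[2]), reverse=True)

for name, probability, impact in ranked:
    print(f"{name}: expected annual loss ${expected_loss(probability, impact):,.0f}")
```

Note what the ranking surfaces: the vending machine is the most likely to be breached, but the cloud storage and the HR database dominate once impact is factored in, which is exactly the shift from counting vulnerabilities to measuring exposure.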