If your company produces gadgets that improve cybersecurity, brace yourself. No matter how much we spend on your next great solution, it won't be good enough. There's still one thing to which we must give more attention, funding, and resources: humans. An organization can implement firewalls, intrusion-detection and intrusion-prevention systems, and artificial intelligence defenses, but it still won't conquer the human factor, the most vulnerable aspect of a cybersecurity plan.
The technological revolution has introduced a plethora of advanced solutions to help identify and stop intrusions. However, data leaks and breaches persist. Shouldn't all this technology stop attackers from gaining access to our most sensitive data?
Historically, Stuxnet, WannaCry, and the Equifax breach are examples of weaknesses in the flesh-and-bone portion of a security plan. All three could have been prevented were it not for human error.
Stuxnet is the infamous worm (allegedly) authored by a joint US-Israeli coalition, designed to slow the enrichment of uranium by Iran's nuclear program. The worm exploited multiple zero-day flaws in industrial control systems, damaging enrichment centrifuges. So, how could this have been prevented?
The Natanz nuclear facility, where Stuxnet took hold, was air-gapped: somebody had to physically plant the worm, likely via a USB drive. An operation like that requires extensive coordination, but personnel at Natanz should have been more alert. Stuxnet was also discovered on systems outside of Natanz, and outside of Iran, which suggests somebody connected a device to the internal network and later connected that same device to the public Internet. Stuxnet traveled from the inside out, but the inverse could just as easily happen when devices hop between internal and external networks.
WannaCry and its variants are more recent, larger-scale examples. Microsoft had issued patches for the SMBv1 vulnerability, eventually removing the protocol version from Windows. Still, some 200,000 computer systems were infected in over 150 countries, to the tune of an estimated $4 billion in ransoms and damages. If human beings had updated their systems, we might never have added "WannaCry" to our security lexicon. At least we can use it as an example of the costs of negligence in cybersecurity curricula, right?
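WannaCry's lesson is mostly about patch discipline. As an illustration (not drawn from any real incident data), here is a minimal Python sketch of a patch-age audit that flags hosts whose last security update is older than a threshold; the hostnames and dates are invented, and a real audit would pull this inventory from a patch-management system:

```python
from datetime import date

# Hypothetical inventory: hostname -> date the last security update was applied.
inventory = {
    "hr-workstation-01": date(2017, 1, 10),
    "finance-server-02": date(2017, 5, 9),
    "lab-pc-07": date(2016, 11, 2),
}

def overdue_hosts(inventory, today, max_age_days=30):
    """Return hosts whose last patch is older than max_age_days, sorted by name."""
    return sorted(
        host for host, patched in inventory.items()
        if (today - patched).days > max_age_days
    )

# On the day WannaCry broke out (May 12, 2017), two of these invented hosts
# would have been flagged as overdue.
print(overdue_hosts(inventory, today=date(2017, 5, 12)))
```

The point is not the code but the habit: a report like this, reviewed by a human every week, is worth more than another appliance in the rack.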
The infamous Equifax breach resulted in the compromise of the personal data of millions of people. The culprit? A vulnerability in Apache Struts for which the Apache Software Foundation had already issued a patch, a patch Equifax failed to apply. The attack was described as "relatively easy," meaning it would not have required a highly skilled technician to execute it and compromise some 145 million personal records. (PATCH!)
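The fix in Equifax's case was a routine dependency update. As a hedged illustration, the sketch below checks a Struts version string against the ranges commonly reported as affected by CVE-2017-5638, the flaw behind the breach; the ranges are recalled from public advisories, and a real audit should pull them from an authoritative CVE feed rather than hard-code them:

```python
# Known-vulnerable Apache Struts ranges for CVE-2017-5638 (per public advisories).
VULNERABLE_RANGES = [
    ((2, 3, 5), (2, 3, 31)),  # Struts 2.3.5 through 2.3.31
    ((2, 5), (2, 5, 10)),     # Struts 2.5 through 2.5.10
]

def parse_version(text):
    """Turn '2.3.30' into a comparable tuple (2, 3, 30)."""
    return tuple(int(part) for part in text.split("."))

def is_vulnerable(version_text):
    """Report whether a Struts version falls inside a known-vulnerable range."""
    version = parse_version(version_text)
    return any(low <= version <= high for low, high in VULNERABLE_RANGES)

print(is_vulnerable("2.3.30"))  # vulnerable: patch before deploying
print(is_vulnerable("2.3.32"))  # a fixed release
```

A check this small, wired into a build pipeline and actually acted on by a person, is exactly the kind of unglamorous human diligence the breach was missing.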
The lesson here? We care too much about gadgets and logical control systems, and not enough about our personnel. Increasing investment in people over yet more technology also aids retention. The IT industry sees a lot of turnover; those decisions are not always about money, but salary is a consideration. And if companies were more willing to train their own employees, they would gain new skill sets without hiring new personnel. Retention also builds familiarity with systems: experienced personnel can address issues quickly. Time is money; downtime is money lost.
Shallow End of the Hiring Pool
In every conversation I've had with hiring managers, and at every cybersecurity conference I've attended, there has been a common theme concerning the state of IT/IS: there is not enough talent in the hiring pool. I'd argue that their organizations haven't shown enough willingness to train their own, provide their employees with the opportunity to learn and grow, and hire people they can teach. Too often, job boards are littered with postings chasing unicorns: mythical IT experts with a mix of experience that could not possibly exist. If organizations would invest in their own people, they could mold someone into that magical creature rather than banging their heads against the job-board walls in search of a candidate who doesn't exist.
Invest in training and awareness for your day-to-day operational employees. Give them incentive to pay attention to the sender, content, and links of an email. Give them a sense of ownership of security, so they challenge an unfamiliar face in the hallway. Teach them techniques that attackers will use to socially engineer them, how they can smell a rat, and how they can thwart the attackers' efforts.
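One such technique is easy to demonstrate in training. The following Python sketch (an invented classroom example, not a production mail filter) flags a classic phishing tell: a display name that claims a trusted organization paired with an address from an untrusted domain. The company name and domain list here are made up:

```python
from email.utils import parseaddr

# Hypothetical list of domains the (fictional) company actually sends from.
TRUSTED_DOMAINS = {"example.com"}

def sender_looks_suspicious(from_header):
    """Flag a From header whose display name claims the company
    but whose actual address uses an untrusted domain."""
    display_name, address = parseaddr(from_header)
    domain = address.rsplit("@", 1)[-1].lower() if "@" in address else ""
    claims_company = "example" in display_name.lower()
    return claims_company and domain not in TRUSTED_DOMAINS

# The lookalike domain swaps a digit "1" for the letter "l": a common trick.
print(sender_looks_suspicious("Example IT Support <helpdesk@examp1e-support.ru>"))
print(sender_looks_suspicious("Example IT Support <helpdesk@example.com>"))
```

Walking employees through a toy check like this teaches the habit that matters: the name on the envelope and the address behind it are two different things, and attackers count on nobody comparing them.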
I'm not saying you should do away with all technical control systems, of course. But when you continue to heap technology onto a system, you limit your hiring pool and spread your employees too thin. Don't create Jacks-and-Jills-of-all-trades. Create masters of yours.