Insider Threats & Insider Objections

The "tyranny of the urgent" and three other reasons why it's hard for CISOs to establish a robust insider threat prevention program.
There’s no shortage of good coverage in the media on the important topic of insider threats. Yet despite the headlines, according to a 2018 report by CA, only 36% of companies surveyed said they had what they considered a mature insider threat program in place. So where is the disconnect? Based on my own experience and that of my CISO friends and colleagues, there are several factors that blunt attempts at establishing a robust insider threat program, among them: long “to do” lists, optics, privacy, mindshare and culture.
The never-ending "to do" list. If it’s not multifactor authentication (MFA), it’s endpoint detection and response (EDR). If it’s not EDR, it’s identity and access management (IAM). If it’s not IAM, it’s BYOD. You get the idea – every new threat (or acronym!) requires a custom-tailored solution, and the list of things to address keeps growing. Thus, CISOs, often caught by the tyranny of the urgent, are forced to make mindful but difficult tradeoffs regarding priorities. In that calculation, insider threat often doesn’t make the cut.
The problem of optics. Maybe you’ve taken a long look at your business and decided that the lack of an insider threat program is significant enough that you should address it. Good for you. Now you’ve got to get past the set of objections I file under the broad heading of "optics." Insider threat just sounds negative. For what it’s worth, I absolutely hate the name because it conjures up visions of shady characters skulking around the water cooler planning dark deeds, and that’s absolutely not how we want to view our coworkers.
While there are plenty of documented examples of employees "going to the dark side," the most effective insider threat programs are focused on protecting employees from themselves, each other, and attackers. The intent of the program is almost wholly positive … but the name is most definitely a negative.
It’s made worse by cultural issues. While I wish it were otherwise, it’s best to admit that many different relationships and dynamics exist between employee and employer. Trust can be an issue, and in many companies there is a distinct sense of "them" and "us" separating executive management from the workforce – something that acts to the detriment of trust. On top of that, if you are a global company operating in different parts of the world, you may face additional challenges from diverse cultural norms. What’s okay in America may be anathema in Zimbabwe, and vice versa; the complications range from muddled privacy regulations and employment laws to multi-department, multinational tensions. Do you really want to jump into that?
The employee privacy issues. Users have legitimate concerns about how much they reveal of themselves to their employers. Not only are there ethical questions, but there is a mish-mash of laws dictating what a company can and cannot do with respect to employee privacy. This becomes a really tough issue for CISOs.
Mindshare – or making the boss happy. The job of the CISO is not as simple as just protecting the company; it’s about making the boss happy – and that boss is ultimately the CEO or the board of directors. If the risks posed by insiders aren’t part of his or her mindshare, insider threat programs won’t look like a good investment. You can do some work to educate, but too often we find that we are faced with people whose minds are already made up.
Those are the objections. It’s your job to figure out how to overcome them. But here are two suggestions:
First, be clear about the facts that justify the cost of an insider threat program. The news is full of stories that chill to the bone with respect to misbehaving insiders. No company can afford to ignore these real-world incidents, and you can make the case about the damage they cause with hard numbers.
Second, take the optics issue head on. Start by having a real dialogue within the company about how programs like this are a force for good, not evil. But make sure that your actions match your words. For example, a well-implemented program doesn’t actually have a negative impact on privacy; it’s all a matter of how you structure it. Yes, there is more up-front work required to do it right. But by putting in the effort, you can make adoption of the program a way to meaningfully increase employee privacy as well as safety and security.
Dr. Richard Ford is the chief scientist for Forcepoint, overseeing technical direction and innovation throughout the business. He brings over 25 years' experience in computer security, with knowledge in both offensive and defensive technology solutions.