A counterpoint to Bruce Schneier's recent post on security awareness training for users

Ira Winkler, Field CISO & Vice President, CYE

March 25, 2013

6 Min Read

When I read Bruce Schneier's recent blog post basically stating that security awareness is a waste of resources, I perceived a general misconception about the fundamental concepts of security awareness, concepts that are critical to the discipline of awareness and to security as a whole. This misconception also highlights why many security awareness programs suck.

Bruce uses the term "security awareness training." There is a very distinct difference between "Security Awareness" and "Security Training." Security training provides users with a finite set of knowledge and usually tests for short-term comprehension. The once-a-year, 10-minute videos that auditors shortsightedly approve as a security awareness program are an example of such training. These are simply "check the box" efforts that are admittedly useless, except to waste time and develop a disdain for security in the minds of the average user.

Security Awareness programs strive to change behaviors of individuals, which in turn strengthens the security culture. Awareness is a continual process. It is not a program to tell people to be afraid to check their e-mail. The discipline requires a distinct set of knowledge, skills, and abilities.

More important, security is about mitigating risk. There is no such thing as a perfect security countermeasure, and there never will be. Every technology or security scheme will, or at least can, be bypassed. This is why security professionals advocate defense-in-depth, knowing that you cannot rely on any single countermeasure. A security program is a holistic set of countermeasures designed to protect against, detect, and react to incidents.

The question then becomes whether security awareness is a cost-effective countermeasure that saves more money than it costs. This is admittedly difficult to answer, because as with all security countermeasures, it is hard to measure the incidents that you prevent. Additionally, few security awareness programs collect metrics. There are, however, many security awareness success stories, and I can refer you to Mitre's site of security awareness successes. Likewise, everyone reading this article knows of many cases where an incident was avoided due to secure behaviors.
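To make that cost-effectiveness question concrete, here is a minimal back-of-envelope sketch in Python. Every number in it (incident rate, average loss, program cost, reduction rate) is a hypothetical placeholder rather than data from any study; the point is only the shape of the comparison an organization would need to make.

```python
# Back-of-envelope cost-benefit comparison for an awareness program.
# All figures below are hypothetical placeholders, not measured data.

incidents_per_year = 20          # assumed baseline incident rate
avg_loss_per_incident = 50_000   # assumed average loss per incident (USD)
program_cost_per_year = 150_000  # assumed annual cost of the awareness program
reduction_rate = 0.4             # assumed fraction of incidents the program prevents

baseline_loss = incidents_per_year * avg_loss_per_incident
residual_loss = baseline_loss * (1 - reduction_rate)
net_benefit = (baseline_loss - residual_loss) - program_cost_per_year

print(f"Baseline annual loss: ${baseline_loss:,}")
print(f"Residual annual loss: ${residual_loss:,.0f}")
print(f"Net annual benefit:   ${net_benefit:,.0f}")
```

With these illustrative inputs the program prevents $400,000 in losses for a $150,000 spend; an organization that actually collects metrics can replace the placeholders with its own figures.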

To that point, I will address Bruce's argument that even if four out of five incidents are prevented, the bad guys still get in. That argument basically says that if the bad guy gets in, all security countermeasures are irrelevant. By that measure, we should abandon security as a whole, since all countermeasures have failed and will fail again.
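To show why stopping four out of five attempts still matters in a layered model, here is a small sketch. The 0.8 figure echoes the "four out of five" in the argument above; the effectiveness values for the other layers are assumed purely for illustration, and the calculation treats the layers as independent, which real controls rarely are.

```python
# Residual risk when layered controls each stop a fraction of attacks.
# The awareness figure (0.80) echoes the "four out of five" above;
# the other values are assumed for illustration only.

layers = {
    "security awareness": 0.80,
    "email filtering":    0.70,   # assumed
    "endpoint detection": 0.50,   # assumed
}

residual = 1.0
for name, stop_rate in layers.items():
    residual *= (1 - stop_rate)
    print(f"after {name:<20} residual fraction = {residual:.3f}")

# Dropping the awareness layer multiplies the residual risk by 1 / (1 - 0.80) = 5x.
```

Under these assumptions only 3 percent of attempts survive all three layers; remove the awareness layer and the residual risk is five times higher, which is the opposite of "irrelevant."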

Unless Bruce has a way to provide perfect security, organizational security programs must acknowledge that failures will happen and determine the most cost-effective strategies to mitigate loss through prevention, detection, and reaction. Security awareness is a critical part of that strategy for most organizations, especially those with the most to lose.

There are several other issues that Bruce's arguments don't address. He essentially argues that security is about preventing malicious parties from getting in and that once a bad actor is in, all is lost. The reality is that the greatest security-related losses result from people with legitimate access. Insiders acting maliciously, and more often innocently, create the most significant losses. Security awareness helps well-meaning insiders recognize when to report a coworker who may be doing something malicious. Likewise, only awareness will stop an employee from taking actions that are not malicious, and may even serve a normally legitimate business purpose, but are otherwise harmful.

Another issue is that Bruce's blog only addresses computer security. The argument for technology-based solutions to user failings does nothing to stop non-computer-related risks, or even risks arising from ordinary office environments. Non-computer-related losses, such as documents left unattended and improperly discarded materials, cannot be stopped by better programmers.

I also have to take exception to Bruce's statement, "Have you ever talked to a user? They're not experts." That is the attitude that causes a rift between security professionals and the general population. Demonstrating such a fundamental lack of respect for users creates a divisive environment. While there are clearly exceptions, most users are well-meaning and competent when asked to take basic security precautions and provided with the proper guidance.

On the point that users are not experts, I have major issues with Bruce's description of the medical profession. The average person is clearly not a medical professional, but they know how to treat basic medical conditions that are far more common than conditions requiring professional attention. People know that when they have congestion, they can start treatment by taking a decongestant. They know that when they have a basic cut, they wash it and put on a bandage. They know that when they have a headache, they take a painkiller. Likewise, the average user is more than capable of handling the majority of security-related issues, if they are made aware of the appropriate behaviors.

I also have to take special exception to what Bruce essentially describes as the replacement for security awareness: 1) designing systems that prevent users from making security-related mistakes, and 2) enabling folk models of security.

Let's first address the "folk models of security." There is no consensus on security folk models, nor does the existence of a folk model mean it should be supported. Since Bruce uses HIV as an example, consider that a folk model found throughout Africa, that having sex with a virgin will cure AIDS, inhibits HIV awareness efforts. Another false folk model is Bruce's stated belief that "The Three Second Rule" is a valid food safety practice. While implementing security in a way that is commonly accepted is a valid goal, the fundamental issue is that you cannot rely on people teaching each other safe computing practices.

In the absence of security awareness, Bruce advocates that developers learn to design systems that are secure against user actions. That is delusional: developers have yet to learn to write software that is secure against technical attacks. It is completely unrealistic to expect programmers to make software secure against all non-technical attacks as well. This is the high-tech equivalent of saying that automobile companies should immediately stop spending money on seat belts and instead try to create cars that reliably drive themselves.

Software that limits the potential damage users can cause would be valuable. However, you can't remove another element of defense-in-depth, whether it is security awareness, anti-virus software, vulnerability scanning, or anything else, while waiting for that solution to magically arrive.

Finally, the most important issue is that security awareness is not optional for most organizations. A variety of organizations with a great deal of money and information at stake, such as the payment card industry, have conducted extensive investigations and determined that a significant portion of their losses comes from human failings. While admittedly many of the resulting programs are poor, following Bruce's advice is clearly not an option.

What is needed is for security professionals to understand that the security awareness discipline requires its own knowledge, skills, and abilities. A competent, or even expert, security practitioner is not a competent security awareness practitioner by default. Organizations need to seek out, or train, people who can implement effective awareness programs and realize some of the highest returns on security investment.

While I acknowledge that many security awareness programs are bad, there are also many incredibly effective ones. I also acknowledge that even the best awareness programs will have failures, just like every other security countermeasure. It is, however, absurd to hold security awareness to a higher standard than any other security countermeasure, especially when a good awareness program has such a comparatively low cost and the advocated alternative amounts to a fantasy.

Ira Winkler, CISSP, is President of Secure Mentem and the author of several security books, including Spies Among Us. Special to Dark Reading


About the Author(s)

Ira Winkler

Field CISO & Vice President, CYE

Ira Winkler, CISSP, is the Director of the Human Security Engineering Consortium and author of the books You Can Stop Stupid and Security Awareness for Dummies. He is considered one of the world’s most influential security professionals and was named “The Awareness Crusader” by CSO Magazine in receiving its CSO COMPASS Award. He has designed, implemented, and supported security awareness programs at organizations of all sizes, in all industries, around the world. Ira began his career at the National Security Agency, where he served in various roles as an Intelligence and Computer Systems Analyst. He has since served in other positions supporting the cybersecurity programs in organizations of all sizes.
