3/19/2013 07:39 AM
Bruce Schneier
Commentary

On Security Awareness Training

The focus on training obscures the failures of security design

Should companies spend money on security awareness training for their employees? It's a contentious topic, with respected experts on both sides of the debate. I personally believe that training users in security is generally a waste of time and that the money can be spent better elsewhere. Moreover, I believe that our industry's focus on training serves to obscure greater failings in security design.

In order to understand my argument, it's useful to look at training's successes and failures. One area where it doesn't work very well is health. We are forever trying to train people to have healthier lifestyles: eat better, exercise more, whatever. And people are forever ignoring the lessons. One basic reason is psychological: We just aren't very good at trading off immediate gratification for long-term benefit. A healthier you is an abstract eventuality; sitting in front of the television all afternoon with a McDonald's Super Monster Meal sounds really good right now.

Similarly, computer security is an abstract benefit that gets in the way of enjoying the Internet. Good practices might protect me from a theoretical attack at some time in the future, but they’re a bother right now, and I have more fun things to think about. This is the same trick Facebook uses to get people to give away their privacy. No one reads through new privacy policies; it's much easier to just click "OK" and start chatting with your friends. In short: Security is never salient.

Another reason health training works poorly is that it’s hard to link behaviors with benefits. We can train anyone -- even laboratory rats -- with a simple reward mechanism: Push the button, get a food pellet. But with health, the connection is more abstract. If you’re unhealthy, then what caused it? It might have been something you did or didn’t do years ago. It might have been one of the dozen things you have been doing and not doing for months. Or it might have been the genes you were born with. Computer security is a lot like this, too.

Training laypeople in pharmacology also isn't very effective. We expect people to make all sorts of medical decisions at the drugstore, and they're not very good at it. Turns out that it's hard to teach expertise. We can't expect every mother to have the knowledge of a doctor, pharmacist, or RN, and we certainly can't expect her to become an expert when most of the advice she's exposed to comes from manufacturers' advertising. In computer security, too, a lot of advice comes from companies with products and services to sell.

One area of health that is a training success is HIV prevention. HIV may be very complicated, but the rules for preventing it are pretty simple. And aside from certain sub-Saharan countries, we have taught people a new model of their health and have dramatically changed their behavior. This is important: Most lay medical expertise stems from folk models of health. Similarly, people have folk models of computer security. Maybe they're right, and maybe they're wrong, but they're how people organize their thinking. This points to a possible way that computer security training can succeed. We should stop trying to teach expertise, pick a few simple metaphors of security, and train people to make decisions using those metaphors.

On the other hand, we still have trouble teaching people to wash their hands -- even though it’s easy, fairly effective, and simple to explain. Notice the difference, though. The risks of catching HIV are huge, and the cause of the security failure is obvious. The risks of not washing your hands are low, and it’s not easy to tie the resultant disease to a particular not-washing decision. Computer security is more like hand washing than HIV.

Another area where training works is driving. We trained, either through formal courses or one-on-one tutoring, and passed a government test to be allowed to drive a car. One reason that works is that driving is a near-term, really cool, obtainable goal. Another reason is that even though the technology of driving has changed dramatically over the past century, that complexity has been largely hidden behind a fairly static interface. You might have learned to drive 30 years ago, but that knowledge is still relevant today.

On the other hand, password advice from 10 years ago isn't relevant today. Can I bank from my browser? Are PDFs safe? Are untrusted networks OK? Is JavaScript good or bad? Are my photos more secure in the cloud or on my own hard drive? The “interface” we use to interact with computers and the Internet changes all the time, along with best practices for computer security. This makes training a lot harder.

Food safety is my final example. We have a bunch of simple rules -- cooking temperatures for meat, expiration dates on refrigerated goods, the three-second rule for food being dropped on the floor -- that are mostly right, but often ignored. If we can’t get people to follow these rules, then what hope do we have for computer security training?

To those who think that training users in security is a good idea, I want to ask: "Have you ever met an actual user?" They're not experts, and we can’t expect them to become experts. The threats change constantly, the likelihood of failure is low, and there is enough complexity that it’s hard for people to understand how to connect their behaviors to eventual outcomes. So they turn to folk remedies that, while simple, don't really address the threats.

Even if we could invent an effective computer security training program, there's one last problem. HIV prevention training works because affecting what the average person does is valuable. Even if only half of the population practices safe sex, those actions dramatically reduce the spread of HIV. But computer security is often only as strong as the weakest link. If four-fifths of company employees learn to choose better passwords, or not to click on dodgy links, one-fifth still get it wrong and the bad guys still get in. As long as we build systems that are vulnerable to the worst case, raising the average case won't make them more secure.
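The weakest-link arithmetic above can be made concrete. A minimal sketch of the math (the 20% failure rate echoes the one-fifth example; the independence assumption and the number of targeted employees are illustrative):

```python
def breach_probability(p_fail: float, n_employees: int) -> float:
    """Chance that at least one employee falls for the attack,
    assuming each fails independently at rate p_fail."""
    return 1 - (1 - p_fail) ** n_employees

# Training 80% of employees to get it right still leaves a 20% failure
# rate; against just 10 targeted employees, the attacker usually wins:
print(round(breach_probability(0.20, 10), 4))  # 0.8926
```

This is why raising the average hardly moves the needle in a weakest-link system: the attacker only needs one success, and the per-target failures compound.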

The whole concept of security awareness training demonstrates how the computer industry has failed. We should be designing systems that won't let users choose lousy passwords and don't care what links a user clicks on. We should be designing systems that conform to their folk beliefs of security, rather than forcing them to learn new ones. Microsoft has a great rule about system messages that require the user to make a decision. They should be NEAT: necessary, explained, actionable, and tested. That's how we should be designing security interfaces. And we should be spending money on security training for developers. These are people who can be taught expertise in a fast-changing environment, and this is a situation where raising the average behavior increases the security of the overall system.
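As a sketch of what "won't let users choose lousy passwords" can mean in practice, here is a minimal server-side acceptance check. The blocklist, length threshold, and character-class rule are illustrative assumptions, not any particular product's policy:

```python
# Hypothetical policy: reject weak passwords at creation time,
# so the system enforces the rule instead of the user remembering it.
COMMON_PASSWORDS = {"password", "123456", "letmein", "qwerty", "monkey"}

def is_acceptable(password: str) -> bool:
    """Reject passwords that are too short, too common, or lack variety."""
    if len(password) < 12:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    # Count character classes present: lower, upper, digit, symbol.
    classes = sum([
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(not c.isalnum() for c in password),
    ])
    return classes >= 3

print(is_acceptable("letmein"))                  # False: too short
print(is_acceptable("correct-Horse-battery-7"))  # True
```

In this model the system makes the decision at the point of use, so the user needs no expertise at all; a real deployment would check candidates against a corpus of breached passwords rather than a toy blocklist.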

If we security engineers do our job right, then users will get their awareness training informally and organically from their colleagues and friends. People will learn the correct folk models of security and be able to make decisions using them. Then maybe an organization can spend an hour a year reminding their employees what good security means at that organization, both on the computer and off. That makes a whole lot more sense.

Bruce Schneier is chief security technology officer at BT, and the author of several security books as well as the Schneier On Security blog. Special to Dark Reading

Comments
George Ou, User Rank: Apprentice
3/19/2013 | 10:34:35 PM
re: On Security Awareness Training
So what you're basically saying is that unless someone devises an electric shock mechanism via a Bluetooth or USB interface that activates whenever the user does something stupid, then user training is probably useless.
Ben0xA, User Rank: Apprentice
3/19/2013 | 9:48:55 PM
re: On Security Awareness Training
Kelly, I talk a lot about how we changed our program and what happened with our users during my DerbyCon 2012 talk. It's a fundamental change to the user security awareness program. It's not about a two-hour jam session once a year. It's about investing in the education of the user base. The trainings must die, but education needs to take its place.

I talk about this in my blog post listed below in the comments as well as link to my DerbyCon talk.
kjhiggins, User Rank: Strategist
3/19/2013 | 9:10:49 PM
re: On Security Awareness Training
I have seen a lot of impassioned debate on this topic on Twitter today. I'd love to hear what readers have experienced with their internal user training programs, as well as thoughts on the point about investing more in security training for developers.

Kelly Jackson Higgins, Senior Editor, Dark Reading
Ben0xA, User Rank: Apprentice
3/19/2013 | 6:48:44 PM
re: On Security Awareness Training
I respectfully disagree. My rebuttal: http://ben0xa.com/security-awa...
Jeff LoSapio, User Rank: Apprentice
3/19/2013 | 5:41:18 PM
re: On Security Awareness Training
This debate continues with too many absolute positions. Is anything in security absolute? We spend a ton of money on AV, firewalls, IDS, etc., yet there are still virus infections and network breaches. Should we throw our hands up and declare all of this technology worthless because bad things keep happening? Whatever happened to the notion of defense-in-depth? If we can agree that end users are vulnerabilities (or sometimes actual threats), then shouldn't we attempt to "remediate" the issue with training? And yes, it's not 100% successful, but neither are the majority of technical security controls. Risk management is about transferring, avoiding, or reducing negative impacts. So if we can train a material percentage of end users to avoid risky behavior, then haven't we reduced risk? And isn't that in the job description?

The problem with most security awareness programs is that they were designed to meet a compliance requirement, and not designed to be effective in changing employee behavior.

Who designed the program at your company? Most likely a security manager or engineer who has no experience in communications, training, or content development. Would you let someone from your HR group configure the firewall?

How often is your awareness program updated? Most likely once a year, if ever.

Do you have any quantifiable goals for the program? Besides counting how many people attended a boring presentation or watched a boring CBT.

The media seems to love this debate, and yet there are rarely any articles about the benefits of awareness training or profiles of successful programs.