
Perimeter
Commentary
3/19/2013 07:39 AM
Bruce Schneier

On Security Awareness Training

The focus on training obscures the failures of security design

Should companies spend money on security awareness training for their employees? It's a contentious topic, with respected experts on both sides of the debate. I personally believe that training users in security is generally a waste of time and that the money can be spent better elsewhere. Moreover, I believe that our industry's focus on training serves to obscure greater failings in security design.

In order to understand my argument, it's useful to look at training's successes and failures. One area where it doesn't work very well is health. We are forever trying to train people to have healthier lifestyles: eat better, exercise more, whatever. And people are forever ignoring the lessons. One basic reason is psychological: We just aren't very good at trading off immediate gratification for long-term benefit. A healthier you is an abstract eventuality; sitting in front of the television all afternoon with a McDonald's Super Monster Meal sounds really good right now.

Similarly, computer security is an abstract benefit that gets in the way of enjoying the Internet. Good practices might protect me from a theoretical attack at some time in the future, but they’re a bother right now, and I have more fun things to think about. This is the same trick Facebook uses to get people to give away their privacy. No one reads through new privacy policies; it's much easier to just click "OK" and start chatting with your friends. In short: Security is never salient.

Another reason health training works poorly is that it’s hard to link behaviors with benefits. We can train anyone -- even laboratory rats -- with a simple reward mechanism: Push the button, get a food pellet. But with health, the connection is more abstract. If you’re unhealthy, then what caused it? It might have been something you did or didn’t do years ago. It might have been one of the dozen things you have been doing and not doing for months. Or it might have been the genes you were born with. Computer security is a lot like this, too.

Training laypeople in pharmacology also isn't very effective. We expect people to make all sorts of medical decisions at the drugstore, and they're not very good at it. Turns out that it's hard to teach expertise. We can't expect every mother to have the knowledge of a doctor, pharmacist, or RN, and we certainly can't expect her to become an expert when most of the advice she's exposed to comes from manufacturers' advertising. In computer security, too, a lot of advice comes from companies with products and services to sell.

One area of health that is a training success is HIV prevention. HIV may be very complicated, but the rules for preventing it are pretty simple. And aside from certain sub-Saharan countries, we have taught people a new model of their health and have dramatically changed their behavior. This is important: Most lay medical expertise stems from folk models of health. Similarly, people have folk models of computer security (PDF). Maybe they're right, and maybe they're wrong, but they're how people organize their thinking. This points to a possible way that computer security training can succeed. We should stop trying to teach expertise, pick a few simple metaphors of security, and train people to make decisions using those metaphors.

On the other hand, we still have trouble teaching people to wash their hands -- even though it’s easy, fairly effective, and simple to explain. Notice the difference, though. The risks of catching HIV are huge, and the cause of the security failure is obvious. The risks of not washing your hands are low, and it’s not easy to tie the resultant disease to a particular not-washing decision. Computer security is more like hand washing than HIV.

Another area where training works is driving. We trained, either through formal courses or one-on-one tutoring, and passed a government test to be allowed to drive a car. One reason that works is that driving is a near-term, really cool, obtainable goal. Another reason is that even though the technology of driving has changed dramatically over the past century, that complexity has been largely hidden behind a fairly static interface. You might have learned to drive 30 years ago, but that knowledge is still relevant today.

On the other hand, password advice from 10 years ago isn't relevant today (PDF). Can I bank from my browser? Are PDFs safe? Are untrusted networks OK? Is JavaScript good or bad? Are my photos more secure in the cloud or on my own hard drive? The “interface” we use to interact with computers and the Internet changes all the time, along with best practices for computer security. This makes training a lot harder.

Food safety is my final example. We have a bunch of simple rules -- cooking temperatures for meat, expiration dates on refrigerated goods, the three-second rule for food being dropped on the floor -- that are mostly right, but often ignored. If we can’t get people to follow these rules, then what hope do we have for computer security training?

To those who think that training users in security is a good idea, I want to ask: "Have you ever met an actual user?" They're not experts, and we can’t expect them to become experts. The threats change constantly, the likelihood of failure is low, and there is enough complexity that it’s hard for people to understand how to connect their behaviors to eventual outcomes. So they turn to folk remedies that, while simple, don't really address the threats.

Even if we could invent an effective computer security training program, there's one last problem. HIV prevention training works because affecting what the average person does is valuable. Even if only half of the population practices safe sex, those actions dramatically reduce the spread of HIV. But computer security is often only as strong as the weakest link. If four-fifths of company employees learn to choose better passwords, or not to click on dodgy links, one-fifth still get it wrong and the bad guys still get in. As long as we build systems that are vulnerable to the worst case, raising the average case won't make them more secure.
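
To put the weakest-link point in rough numbers, here is an illustrative back-of-envelope calculation -- not from the article -- assuming a single compromised account is enough to let an attacker in and that employees' decisions are independent. The function name and the figures are hypothetical.

    # Rough sketch: probability that at least one of N employees slips,
    # given a per-person failure rate. Assumes independent decisions.
    def chance_of_at_least_one_failure(per_person_failure_rate: float, employees: int) -> float:
        return 1.0 - (1.0 - per_person_failure_rate) ** employees

    # Suppose training cuts the per-person failure rate from 20% to 5%.
    # With 500 employees, the attacker still gets in almost surely either way.
    for rate in (0.20, 0.05):
        print(f"rate {rate:.0%}: {chance_of_at_least_one_failure(rate, 500):.12f}")

Raising the average, in other words, barely moves the number that matters when a single failure is fatal.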

The whole concept of security awareness training demonstrates how the computer industry has failed. We should be designing systems that won't let users choose lousy passwords and don't care what links a user clicks on. We should be designing systems that conform to their folk beliefs of security, rather than forcing them to learn new ones. Microsoft has a great rule about system messages that require the user to make a decision. They should be NEAT: necessary, explained, actionable, and tested. That's how we should be designing security interfaces. And we should be spending money on security training for developers. These are people who can be taught expertise in a fast-changing environment, and this is a situation where raising the average behavior increases the security of the overall system.
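
As a minimal sketch of "systems that won't let users choose lousy passwords" -- illustrative only, not anything the author or Microsoft prescribes -- the check below rejects weak choices at account creation instead of training users to avoid them. The tiny denylist and length threshold are assumptions; a real system would screen against a large breached-password corpus.

    # Hypothetical policy check: the system, not the user, rules out weak passwords.
    COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein", "iloveyou"}

    def password_is_acceptable(candidate: str) -> bool:
        if len(candidate) < 12:                      # assumed minimum length
            return False
        if candidate.lower() in COMMON_PASSWORDS:    # assumed small denylist
            return False
        return True

    # Example: the weak choice is simply refused at account creation.
    assert not password_is_acceptable("letmein")
    assert password_is_acceptable("correct horse battery staple")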

If we security engineers do our job right, then users will get their awareness training informally and organically from their colleagues and friends. People will learn the correct folk models of security and be able to make decisions using them. Then maybe an organization can spend an hour a year reminding their employees what good security means at that organization, both on the computer and off. That makes a whole lot more sense.

Bruce Schneier is chief security technology officer at BT, and the author of several security books as well as the Schneier On Security blog. Special to Dark Reading

Comments
George Ou, User Rank: Apprentice
3/19/2013 | 10:34:35 PM
re: On Security Awareness Training
So what you're basically saying is that unless someone devises an electric shock mechanism via Bluetooth or USB interface that activates whenever the user does something stupid, then user training is probably useless.
Ben0xA, User Rank: Apprentice
3/19/2013 | 9:48:55 PM
re: On Security Awareness Training
Kelly, I talk a lot about how we changed our program and what happened with our users during my DerbyCon 2012 talk. It's a fundamental change to the user security awareness program. It's not about a two-hour jam session once a year. It's about investing in the education of the user base. The trainings must die, but education needs to take its place.

I talk about this in my blog post listed below in the comments, as well as link to my DerbyCon talk.
kjhiggins, User Rank: Strategist
3/19/2013 | 9:10:49 PM
re: On Security Awareness Training
I have seen a lot of impassioned debate on this topic on Twitter today. I'd love to hear what readers have experienced with their internal user training programs, as well as the point about investing more in security training for developers.

Kelly Jackson Higgins, Senior Editor, Dark Reading
Ben0xA, User Rank: Apprentice
3/19/2013 | 6:48:44 PM
re: On Security Awareness Training
I respectfully disagree. My rebuttal -> http://ben0xa.com/security-awa...
Jeff LoSapio, User Rank: Apprentice
3/19/2013 | 5:41:18 PM
re: On Security Awareness Training
This debate continues with too many absolute positions. Is anything in security absolute? We spend a ton of money on AV, firewalls, IDS, etc., yet there are still virus infections and network breaches. Should we throw our hands up and declare all of this technology worthless because bad things keep happening? Whatever happened to the notion of defense-in-depth? If we can agree that end users are vulnerabilities (or sometimes actual threats), then shouldn't we attempt to "remediate" the issue with training? And yes, it's not 100% successful, but neither are the majority of technical security controls. Risk management is about transferring, avoiding, or reducing negative impacts. So if we can train a material percentage of end users to avoid risky behavior, then haven't we reduced risk? And isn't that in the job description?

The problem with most security awareness programs is that they were designed to meet a compliance requirement, and not designed to be effective in changing employee behavior.

Who designed the program at your company? Most likely a security manager or engineer who has no experience in communications, training, or content development. Would you let someone from your HR group configure the firewall?

How often is your awareness program updated? Most likely once a year, if ever.

Do you have any quantifiable goals for the program? Besides counting how many people attended a boring presentation or watched a boring CBT.

The media seems to love this debate, and yet there are rarely any articles about the benefits of awareness training or profiles of successful programs.