Operations
6/11/2014 12:00 PM

Don’t Let Lousy Teachers Sink Security Awareness

You can't fix a human problem with a technology solution. Here are three reasons why user education can work and six tips on how to develop a corporate culture of security.

I strongly believe that end-user awareness training is a very important part of a defense-in-depth security strategy. While we need technological controls, controls will never catch everything -- and social engineers will always find new ways to trick users into doing things they shouldn't.

The bottom line is that you can't fix a human problem with a technology solution. You need to train a culture of security.

Unfortunately, a significant portion of the InfoSec community -- including some security gurus I respect greatly -- disagree with me on this. They believe end-user education is worthless. Their arguments are wrong and here's why:

Argument No. 1: Even if training reduces bad user behavior, a mistake from one bad egg still lets threats in. This is the most inane argument against security training I've ever heard. If you are a security professional, you understand that no security control is invulnerable.

No, training will not make your users faultless security ninjas who never make mistakes, but your technical controls don't do that either. Training will, however, lower the number of mistakes users make, which lessens the pressure down the line for your technical security controls and your incident response team.

Argument No. 2: Average people don't care about security; it's too abstract of a problem. The InfoSec problem is only abstract to the people who are uninformed about the issue. The whole point of training is to inform them. It takes time to change culture, and a shift towards better InfoSec awareness is a culture change, but training does work.

Argument No. 3: Users are just ignorant lay people who don't get it; they'd have to be experts to really understand and it's just too hard to make them experts. To me, that argument is the crux of the problem. While, admittedly, this is a gross overgeneralization, a large part of the IT community seems to trivialize the intelligence and potential of the average end-user.

If you've been in the IT profession for a while, you've probably heard terms like PEBCAK (Problem Exists Between Chair And Keyboard) and luser (a user who is also a loser), or you've heard phrases like, "You can't patch stupid," or, "It's a layer eight problem." I believe over time these sorts of jokes have slowly poisoned our community into assuming the average end-user is clueless and stupid. This couldn't be further from the truth.

It's not that IT professionals don't want to be inclusive; most genuinely do share their knowledge and insight. It's just that we are so used to talking to peers in our succinct, albeit harsh, shorthand that we forget what it was like not to understand it. That habit makes IT and InfoSec pros lousy teachers.

The good news is it's easy to change. You can start by following six simple tips that should help improve your security awareness training success rate.

Tip No. 1: Get users on your team. Often, corporate security training comes off as, "You need to be a good employee and protect the company, and here are all the draconian rules." Rather, you should highlight how this security training directly benefits the users themselves. For example, the same InfoSec practices that help protect your company will also help employees at home. If they realize the personal benefits of this sort of training, I think you'll find they'll be much more willing to use them at work as well.

Tip No. 2: Simplify your goals and messages. Training is not about making end-users InfoSec experts. It's about sharing just enough information to foster some key behaviors. In other words, if you are training them about buffer overflow flaws, you're doing it wrong. Instead, you should be training them about how to recognize phishing emails or how to handle unsolicited attachments. In the end, you want them to know enough about the potential problem that they will adopt the right behavior.

Tip No. 3: Don't spout acronyms without explanation. In short, don't speak in the same shorthand you use with peers. Even if you think a term or acronym is well recognized, spend the extra minute to explain it.

Tip No. 4: Examples, anecdotes, metaphors. When you are teaching security awareness, find a way to ground the subject with real examples. For my training presentation, I'm known for throwing in some sort of actual attack or "hacking" demo. You may not have the time or resources for a full demo, but you can at least share sample phishing emails, or tell stories about actual malware or attacks.
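If you lack the time for a live hacking demo, even a toy script can make the point concrete during a session. The sketch below is a hypothetical training aid, not anything from the article or a real detection product: the red-flag patterns and the sample email are illustrative assumptions only, chosen to mirror the indicators you would walk through with sample phishing emails.

```python
import re

# Hypothetical demo aid for a training session: scan an email body for a
# few classic phishing red flags. The indicator list is illustrative only,
# not a real phishing detector.
RED_FLAGS = [
    (r"(?i)\burgent\b|\bimmediately\b|\baccount (will be )?suspended\b",
     "Pressure or urgency language"),
    (r"(?i)\bverify your (account|password|identity)\b",
     "Request to 'verify' credentials"),
    (r"(?i)\bdear (customer|user|member)\b",
     "Generic greeting instead of your name"),
    (r"https?://\d{1,3}(\.\d{1,3}){3}",
     "Link that points at a raw IP address"),
]

def scan_email(text):
    """Return the descriptions of every red flag found in an email body."""
    return [label for pattern, label in RED_FLAGS if re.search(pattern, text)]

# A made-up sample phishing email for the walkthrough.
sample = """Dear Customer,
Your account will be suspended immediately unless you verify your password
at http://192.168.10.5/login"""

for flag in scan_email(sample):
    print("-", flag)
```

Walking the group through why each pattern fires on the sample email ties the abstract advice ("watch for urgency," "check where links really go") to something they can see.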

Tip No. 5: Make learning fun and interactive. There are many ways to make training fun. For example, break the group into teams, give them some email samples, and award a prize to the team that identifies the most potentially malicious emails. I know security is a serious subject, but if you get the group interacting and laughing, they'll be more open to the serious advice you give them.

Tip No. 6: Creating a security culture takes time. Finally, don't expect complete change overnight. Everyone wants an easy fix, but expecting a single presentation to stop users from ever clicking a phishing link is not realistic. With new employees arriving and the threat landscape changing, you will have to refresh and repeat trainings a few times a year.

In my opinion, end-user security training is worth it, despite what some naysayers might claim. There's even data to support that it works. However, not all training is created equal. If we are inclusive and show passion in what we share, I think you'll find the average end-user can be converted into a resilient InfoSec neophyte, making your job a bit easier.

Corey Nachreiner regularly contributes to security publications and speaks internationally at leading industry trade shows like RSA. He has written thousands of security alerts and educational articles and is the primary contributor to the WatchGuard Security Center blog, ...
Comments
Marilyn Cohodas, User Rank: Strategist
6/16/2014 | 4:11:50 PM
Re: Excellent Review
It's great that you have such a positive -- and long-term view -- of the issues. It sounds like you are up to the challenge. Thanks for sharing.
SecOpsSpecialist, User Rank: Apprentice
6/16/2014 | 4:03:49 PM
Re: Excellent Review
Marilyn,

My successes come from a variety of places. In many cases, they come from the fact that I understand that learning and culture change is a process. Due to various NDAs as well as privacy agreements, I cannot share the names of the companies.

When people set out to implement something that will change culture, they often struggle because the change is expected to happen overnight. As Corey said in his article, change does not happen overnight.

One challenge I face on a day-to-day basis: we have people in the organization who do not take the security program seriously and tend to either ignore the message we send out, or skim it and then toss it aside because they do not believe it applies to them. Part of my job is to help these individuals see that it is important and it does apply to them, and that there's more to it than rules and regulations; these are in place for a reason, not just to make their lives more difficult.

I do look forward to continuing to grow my security program here at the organization where I am employed as the individuals I work with are fantastic. Perhaps a bit stubborn, but that's to be expected with culture change.
Marilyn Cohodas, User Rank: Strategist
6/16/2014 | 3:32:42 PM
Re: Excellent Review
@SecOpsSpecialist Where have you found your successes in creating a security culture? I'd love to hear about your victories -- and also some of your challenges.
SecOpsSpecialist, User Rank: Apprentice
6/16/2014 | 12:20:34 PM
Excellent Review
Corey -

You have a fabulous article here, and I found myself nodding along and agreeing with you. As the Security Awareness person for my organization, I often find myself in this same position. It's a mandate that users lock their workstations before they walk away, but some still forget to do it. We can remind them only so much before we have to show them the error and have them realize what can actually happen because they've left the machine unprotected.


You are absolutely right when you say that security culture cannot change overnight, especially in an organization where there's a mixture of the newer blood and the older blood. I sincerely hope that more people pay attention to this, especially those who are trying to start a security awareness program at their place of employment.
Marilyn Cohodas, User Rank: Strategist
6/12/2014 | 9:56:24 AM
PEBCAK & luser
I can't tell you how frustrating it is -- as an end-user -- when the assumption from the technical team is that the problem is a result of operator (luser) error. If the technology worked flawlessly, a lot of IT people would be out of a job! To fulfill Security Awareness Tip No. 1 ("Get users on your team"), you'll need to treat users as real people (not PEBCAKs) with something between their ears.
dwatson777, User Rank: Apprentice
6/11/2014 | 6:11:08 PM
Great Article!
Great Article.  I agree.
Randy Naramore, User Rank: Ninja
6/11/2014 | 4:40:27 PM
Re: tips
Good post. Very interesting read.
CoreyNach, User Rank: Apprentice
6/11/2014 | 3:54:08 PM
Re: Tough Material
Wow... thanks for your thorough comments. It sounds like you have a lot of practical advice from first hand experience....

On the idea of having consequences for breaking policy... I think there is a middle ground. First, I agree that you need both training and technical security controls... that's my point. The best training won't make people perfect, so you still need to audit, but the best technical security measures are not infallible... together they reinforce each other. I also agree that your organization's security policy should have some potential teeth, meaning employees should understand that major breaks in policy could result in termination. And the employee should be held accountable: at the end of a training, they should somehow acknowledge that they understand the policies that were communicated to them (by signing something). That said, I do believe you can communicate these policies in a way that employees understand what's at stake. Rather than an attitude of, "here are the rules, follow them or else," you can adopt a tone of, "here are some serious problems, here's how they can cost our business, and all of us, money and heartache, and here are some rules you should follow to avoid these issues; by the way, if you follow these rules at home, you might avoid issues there as well. We do enforce these rules and will hold you accountable to them, but they really are in your best interest."

Anyway, it sounds like we both agree, but I think you can deliver these sorts of policies in a way that comes across less harsh and will still result in as much adoption of whatever practice you are teaching...
CoreyNach, User Rank: Apprentice
6/11/2014 | 3:44:13 PM
Re: tips
Thanks... I was recently reminded what it is like to be new to a subject that has its own language. I started a new hobby (aerial videography with multicopters) and joined a forum where the members had a ton of acronyms and terms of their own. I could not understand half the posts until I figured out a bunch of new acronyms... so this experience really drove that tip home for me. ^_^
Christian Bryant, User Rank: Ninja
6/11/2014 | 2:54:00 PM
Tough Material
@Corey Nachreiner

First, kudos on a thorough article.  It's a fine collection of tips.  I'd like to add a few notes of my own.  I've been in IT, specifically build/release management, for 15+ years, and security has always been the secret passion.  Because of that, it is always part of my auditing documentation.  Also, I write howtos and other documentation for staff, so I have a special interest in training, but also methods for ensuring retention of information. 

Argument No. 1:  One or more bad eggs can and do cause significant damage.  This is why a two-pronged approach to security is needed:  1) build the technical infrastructure needed to prevent internal and external security risk, accompanied by the right organizational processes (checks and balances), and 2) train users thoroughly in "best practices," common mistakes, and so forth, but ALSO remind them of the seriousness of aiding in security abuse, knowingly or otherwise.  I think that right there is one major shortcoming in user training:  put the fear of legal response and termination into everyone. It sounds harsh, but you know that Snowden's example has set in motion process and technology audits like nothing seen in that department in years.  This is serious stuff.

Argument No. 2:  To my notes above, the average person WILL care about security once they realize they can be held accountable, and that abuse of security protocols is punishable in no small way. 

Argument No. 3:  Surprise!  Those archaic references are becoming obsolete as more average users become tech savvy, partly because the population of users is younger and tech has been at their fingertips since childhood.  My daughter isn't interested in tech as a profession, but at 7 years old she has her own Debian GNU/Linux computer, uses LibreOffice regularly, and pointed out technical work-arounds in TuxPaint I hadn't thought of.  Any IT staff who dumb down their material, or who don't try to educate at all, based on assumptions about the end user are going to have a very unsuccessful career ahead of them.

Tip No. 1:  Based upon my previous comments, you can guess I'm half on the fence here.  I do believe in the draconian rules, to some extent.  Fear of legal punishment is what put me on a straight and narrow path when I was a young man.  But at the same time, I believe that personalizing the benefits of security are key, too.  Billions of dollars are taken from innocent people through cyber crime and in the end, we _are_ here to make life better for the average person.   

Tip No. 2:  Absolutely agree.  And do it with simple graphics in a brief presentation or video.  YouTube is king when it comes to training! 

Tip No. 3:  And the same holds true with documentation.  Always expand the acronym first, before switching to it in later parts; e.g., "The Open Web Application Security Project (OWASP) has a MeetUp. Join the OWASP MeetUp today."

Tip No. 4:  I find tying your example to cyber crime news that makes network news works really well.  Heartbleed was good for that because it was all over CNN, MSNBC, CSPAN, and major networks.  Snowden (how he did it, not why) is also a good example.  Use recognizable examples - saves you time to recreate the hack yourself. 

Tip No. 5:  Say no more. I have kids!  On a serious note, though, you need to also remind folks that they are their colleague's keeper when it comes to security.  Incentives for whistle blowers, while it may leave a bad taste in the mouth, might be necessary.  Taking the game from a friendly group competition that is visible to an internal game where bad behaviour is recognized and privately reported for gain is sometimes what it takes to keep employees from joining together to commit crime, or from ignoring signs of criminal behavior they witness.   

Tip No. 6:  Every company is different and ultimately, you may have to choose between a visible security team and an invisible one.  When folks forget there are security personnel onsite, auditing traffic and observing video sessions, they slip and make mistakes.  Someone who is intent on committing a crime is going to do it, but only when they feel safe to do so.  The initial training and fear of reprisal is a necessity, but at what point do you decide that the fun and games approach to security needs to go out the window and maintaining a quiet, efficient and hard-hitting security audit team makes more sense?