Or, how to blunt the bad guys' advantage and give users an incentive to make smarter choices

Dark Reading Staff, Dark Reading

February 15, 2007


When I get into discussions with peers on people and security, I’m often struck by how little education we've received on human behavior, both in school and out. This lack of knowledge goes a long way in explaining why so many managers can't manage, so many marketing people can't create demand, and so many security professionals throw up their hands when it comes to getting people (employees, executives, customers, vendors, and security personnel) to do their part.

The latter is very troubling right now because the fastest-growing and arguably most successful attack vector for cybercrime is tricking people into providing information that allows criminals access to critical resources. This means that while security professionals aren't focused on behavior, criminals are, and, if you think about telephone scams and confidence schemes, always have been.

This is because criminals have mastered manipulation. If they can succeed in getting folks to do things that are clearly not in their best interests, why can't security specialists use similar skills to get people to do what is? As desperately as security specialists need that skill set, they simply don't have it.

Security and Pathology
I've come to the realization that security experts need to spend some time studying human behavior: specifically, the causes of bad behavior, how to encourage appropriate behavior, and how to discourage inappropriate behavior. We should also focus on awareness of behavior as an indicator of problems to come. This would better prepare the security services inside a company for problems related to disgruntled employees, which can, if they turn violent, easily eclipse the external threats we focus on more.

If people are both our greatest asset and our greatest exposure, then protecting them -- even from themselves -- becomes our highest priority. And if we don't develop the skills to do that, isn't that as negligent as leaving doors unlocked or firewalls powered off in the face of escalating physical and cyber attacks?

If we can't motivate the people who work for us to learn how to motivate others to do the right thing, how are we going to get ahead of the current, increasing wave of attacks targeted at those same people? More important, don’t you wonder how often a security professional is tricked into creating a security exposure? Such things are seldom spoken of, but there is no inherent intelligence advantage just because you have “security” on your shirt.

Obviously, you have to get users' attention before you can motivate them to do anything. Folks have massive numbers of distractions and competing priorities, and, unfortunately, security just doesn't figure highly in their awareness. Oh, certainly, if their identity is stolen (or a friend's or co-worker's) they will pay attention for a while and will even be a little more vigilant. But, in time, they will go back to the way things were and become just as vulnerable, and, from your perspective, just as big a potential security problem as they once were.

Here's the only method I know that works: a combination of rewards, penalties, and constant testing. Employees who actively try to prevent problems should have that effort count toward their year-end incentives; those who are negligent risk losing part or all of those same incentives, and get to spend some embarrassing personal time in front of their manager's manager.

For executives, the penalties should be even tougher. Why? Because they set the behavior standard for everyone else, and because executives too often misuse their authority in ways that create security problems. You want to make it incredibly clear that such moves are not consistent with long-term, successful careers.

All of this should be tested with actual attempted attacks designed to get employees to give up critical information. An employee who slips up with an outsider may not realize it and thus won't report it; even those who do figure it out may not report the problem for fear of getting in trouble.

Attempts to compromise an employee should be made as part of an unannounced security audit, and penalties for giving out information in error should be vastly lower if the employee reports the problem than if the problem is discovered and traced back to the employee. This is the same method used to keep people from fleeing the scene of a car accident; there's no reason it shouldn't work here.
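To make that scaling concrete, here is a minimal sketch in Python of how a simulated-attack result might translate into an incentive adjustment. The class name, function name, and weight values are hypothetical illustrations of the reward/penalty idea described above, not anything prescribed here or by any real audit program; the only requirement argued for is that a self-reported failure cost far less than one that has to be traced back to the employee.

from dataclasses import dataclass

# Hypothetical weights: positive values are penalties, negative values are rewards.
PENALTY_CAUGHT = 1.00      # gave up information; failure discovered by the audit team
PENALTY_REPORTED = 0.20    # gave up information, but self-reported promptly
REWARD_RESISTED = -0.50    # resisted the lure and reported the attempt

@dataclass
class TestResult:
    employee_id: str
    gave_up_information: bool  # did the simulated attack succeed?
    self_reported: bool        # did the employee report the attempt?

def incentive_adjustment(result: TestResult) -> float:
    """Signed adjustment to year-end incentives for one simulated attack."""
    if result.gave_up_information:
        return PENALTY_REPORTED if result.self_reported else PENALTY_CAUGHT
    if result.self_reported:
        return REWARD_RESISTED
    return 0.0  # neither fooled nor reporting: no change

# Example: fooled by the test, but came forward immediately.
print(incentive_adjustment(TestResult("emp-042", True, True)))  # 0.2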

Customer Vigilance and eBay
I don't use eBay because I can no longer verify whether its email is legitimate. In fact, now that I don't use eBay, I know the massive amount of email from supposed sellers and buyers on the site is fake, because I neither sell nor buy using the service. I used to report this mail, but it took too long to figure out how to do it right, and when I did, there was no sense it went anyplace but some big corporate trash bin in the sky.

Just to see if things had changed, I tried it again today. It still takes a long time to find the address ([email protected]), and you do get a nice note back saying eBay personnel will look into the email and let you know what happens, along with a bunch of instructions that suggest you shouldn't expect much. I still wouldn't use eBay, and I don't see any incentive to report the fraudulent mail other than that it might do some good.

Later you get another email inviting you to take online training, but it provides no incentive whatsoever to do so. Imagine if your IT shop suggested training but never required it; I expect most people would decline that training as well.

If I'm using eBay, eBay should remind me regularly how much pain I'll be in if I get tricked by one of those emails. There should be a non-trivial reward of some kind for reporting emails that result in the arrest of a criminal. The address for reporting a problem and the details of the proposed incentive program should be right on the home page or, at most, one click away behind a distinctive heading. And the service should actively market the program to make people aware of the benefits, both in terms of safety and of the reward associated with participation.

I think eBay should send out bogus test emails, alert users when they’ve been tricked, and remind them what to look for to keep it from happening again. In addition, it should provide links to resources so victims can protect their assets if they've fallen for a similar email allegedly from eBay (a high probability here).

Finally, regular members ought to get a detailed account of security problems that were reported and had material consequences. Something like: "Bill got a bogus email and responded to it... He'll be spending the next nine to fifteen months rebuilding his life and trying to recoup $100,000 worth of his time and resources. This could have been prevented if..."

This approach provides incentives, inherent penalties, and a more convenient way to report problems in the first place. My impression is that eBay now simply goes through the motions so it doesn't look negligent. But since its efforts are inadequate, I doubt it will be able to avoid blame for long. More important, there are probably a growing number of potential customers who, like me, avoid eBay like the plague.

Like any battle, security requires a commitment to understand the tools that must be used, as well as how the bad guys think and behave. People are the most powerful tool on either side of the equation. If we don't spend time learning how to make our tools work for us, criminals certainly will be more than happy to spend the time making them work against us.

— Rob Enderle is President and Founder of Enderle Group. Special to Dark Reading
