Our current approach to cybersecurity awareness is broken. Often, security leaders treat employees as the weakest link in an organization's security chain, yet they should be viewed as an organization's strongest asset. While it's no secret that the top reasons for cyber breaches point to human — or end-user — actions, our approach to educating employees is largely flawed.
Despite the prevalence of employee-centric attack methods, less than 5% of the typical organization's security budget is spent on its people. Even that small portion goes to outdated security awareness models built around simulated attacks designed to demonstrate how susceptible workers are — exercises that accomplish little beyond embarrassing employees and positioning security teams as antagonists rather than allies. These measures very rarely produce the positive behavior change that is their ultimate goal.
It's clear that traditional approaches to cybersecurity training have failed. From mistakenly disclosed account credentials to successful phishing attacks, an organization's sensitive data leaks time and again through legitimate channels with a worker's unwitting help — demonstrating that cybersecurity is increasingly a behavioral challenge. Instead of clinging to measures that have repeatedly proven ineffective at safeguarding organizations, security leaders must redesign cybersecurity awareness with the human mind at the forefront. For that, we must turn to basic principles of psychology to better understand human behavior — and how to positively influence it.
The Role of Cognitive Bias
In the 1970s, cognitive psychologists Daniel Kahneman and Amos Tversky discovered that, in certain circumstances, human judgment deviates from rationality in systematic ways they called cognitive biases. These mental shortcuts expedite and simplify information processing, but they also skew decision-making and have been shown to affect organizations' cybersecurity.
SecurityAdvisor recently assessed more than 500,000 malicious emails to better understand the cognitive biases used by malicious entities to target enterprise employees. Authority bias, in which a cybercriminal pretends to be a person of authority in the user's organization (e.g., a CEO), was one of the top five cognitive biases used in spam and phishing schemes. Improving employees' understanding of these biases makes flawed reasoning easier to identify and counteract. With authority bias, for example, organizations can design security awareness training that coaches employees not to rush to fulfill apparent management requests but to report suspicious ones instead.
While it's nearly impossible to unlearn these biases, we can improve our employees' understanding of cognitive biases to make it easier to identify and mitigate the impact of psychologically powered cyberattacks — and ultimately facilitate changes in individual cybersecurity behavior.
Repeating Behaviors, "Nudging" Employees
Knowing that cognitive biases consistently lead to risky behaviors, security leaders must ensure that their organization's employees are routinely educated to identify attacks. The most effective way to increase understanding of these biases is rooted in the psychology of associative learning. Every behavior triggers thousands of neurons, which form a neural network that helps us learn, store, and recall information efficiently. When a behavior is repeated over and over, our brains learn to trigger the same neurons each time. All biases are formed through this basic learning mechanism — and it is also how we can unlearn risky behaviors.
Long-term behavior change requires consistent training and engagement. Psychologist Hermann Ebbinghaus's work on the forgetting curve showed why: without reinforcement, newly learned information decays rapidly. The average person therefore needs regular reminders to apply knowledge at the right moment. Organizations should continually communicate educational content on new and emerging threats in real time, simultaneously improving employees' understanding of their inherent biases and preventing breaches.
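Ebbinghaus's finding is often summarized as an exponential forgetting curve. A minimal Python sketch, assuming the common simplification R = exp(-t/S) with purely illustrative stability values (not fitted data), shows why a one-off seminar fades while spaced reminders hold:

```python
import math

def retention(days_since_training, stability):
    """Exponential approximation of Ebbinghaus's forgetting curve:
    the fraction of learned material still recalled after a delay.
    The form R = exp(-t/S) and the stability values used below are
    simplifications for illustration only."""
    return math.exp(-days_since_training / stability)

# A single annual seminar: recall collapses within a week.
print(round(retention(1, stability=2.0), 2))   # 0.61
print(round(retention(7, stability=2.0), 2))   # 0.03

# Spaced reminders are commonly modeled as raising stability, so the
# same one-week gap costs less recall after each successive review.
for review, s in enumerate([2.0, 4.0, 8.0], start=1):
    print(f"after review {review}: {retention(7, s):.2f}")
```

The exact numbers matter less than the shape: each reinforcement flattens the decay, which is the case for frequent, bite-sized training over annual seminars.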
In fact, the work of Nobel laureate and behavioral economist Richard Thaler shows that choice architecture can steer human behavior through "subtle nudges." Based on indirect encouragement and enablement, nudge theory presents curated choices that encourage people to make positive and helpful decisions — reshaping existing behaviors and counteracting innate cognitive biases.
In some instances, nudge theory is already being used effectively to help humans adopt more secure behaviors. One of the most common and best examples is the password strength meter: as someone creates a password, a bar shifts from red to yellow to green as the password becomes longer and more complex. For organizations looking to educate employees about online threats and positively change their behavior, these nudges might take the form of an alert that the email an employee just opened may contain a malicious link, paired with a bite-sized video or flier on steps to verify a link before clicking it.
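The password-meter nudge can be sketched in a few lines of Python. The scoring rules here are invented for illustration (real meters such as zxcvbn use far more sophisticated estimates); the point is the immediate red/yellow/green feedback that steers users toward stronger choices without blocking them:

```python
def strength(password):
    """Toy score: length plus character variety. Illustrative only,
    not a real strength estimator."""
    score = 0
    if len(password) >= 8:
        score += 1
    if len(password) >= 12:
        score += 1
    if any(c.isdigit() for c in password):
        score += 1
    if any(c.isupper() for c in password) and any(c.islower() for c in password):
        score += 1
    if any(not c.isalnum() for c in password):
        score += 1
    return score

def meter(password):
    """Map the score to the colored bar the user sees -- the nudge."""
    score = strength(password)
    if score <= 2:
        return "red"
    if score <= 3:
        return "yellow"
    return "green"

print(meter("pass"))                     # red
print(meter("Password123"))              # yellow
print(meter("Correct-Horse-Battery-9"))  # green
```

The design choice is the essence of a nudge: the user remains free to pick any password, but the instant, low-friction feedback makes the better behavior the easier one.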
As they currently exist, traditional security awareness efforts are not enough to change individual behaviors. They are too impersonal and too infrequent to impart any lasting change in employee behavior.
Building a deep understanding of human behavior — and delivering training consistently over time — does a far better job of educating employees about cybersecurity risks than traditional security awareness seminars ever could. To bring about real reductions in human cyber-risk, organizations must turn to psychology to redesign their awareness programs.
An approach to security awareness that works naturally with the human mind is the first step toward fundamentally changing individual behavior and, ultimately, bolstering an entire organization's security posture. It might just be the key to making our people our most powerful security asset.