3 Ways Behavioral Economics Obstructs Cybersecurity
People are not robots; their decisions are based on emotion as much as data. Often, this can lead them to make mistakes with serious security implications for the business.
In the workplace, it is easy to forget our humanity. Because business conversations revolve around serious topics like the bottom line and strategic planning, it is assumed everyone is driven purely by data, making rational choices based on evidence. In a business setting, people are treated as rational actors who operate with self-control and always make optimal decisions.
Behavioral economics conflicts with this belief. It argues that humans are subject to emotions and rampant impulsivity — even in business. This theory states that we are still humans at work — and that our circumstances and environments influence and lead us to make irrational decisions more often than not. This means even when we have all the data, we don't always follow economic model predictions or, put simply, do what we "should" do.
3 Ways Security Is Impacted by Behavioral Economics
There is a human element behind people's decisions, yet in business, emotions are often ignored in favor of big data. They persist no matter how much we resist them; despite our best efforts, we remain unpredictable and biased.
Security is an area significantly impacted by behavioral economics. Since cybersecurity is a high-pressure field filled with ongoing incident management, behavioral economics theories can hamper security programs and throw risk-management road maps off course if security professionals aren't careful.
Mental Accounting
Mental accounting is a vein of behavioral economics that argues individuals think about money differently depending on circumstances. Irrational decision-making occurs when people place different values on money depending on their environment or the framing of the topic.
Mental accounting impacts cybersecurity because it can be onerous to obtain budget for risks that haven't materialized. If you're pressing for funding to purchase an incident response retainer, other leaders' mental accounting might discount the need because the risk is neither present nor ongoing.
Mental accounting might lead finance and other leaders to ask: "Why pay for something that might happen?" Cybersecurity leaders know planning is critical to protecting the business, and failing to purchase appropriate tools will cause pain if a breach or security incident happens. As IBM reports (registration required), the average cost of a breach is $4.45 million. Security leaders must therefore frame their budgetary needs effectively to protect the business and to ensure they obtain adequate funds for breach response.
Sunk Cost Fallacies
The sunk cost fallacy argues that people continue to invest in losing projects because they have already committed significant time or resources. In cybersecurity, it can surface when professionals become too attached to their security road map rather than letting it remain dynamic. When you develop a multiyear security road map, loss aversion makes it easy to become attached to it.
While delivering on a road map is critical from a security perspective, it's also essential to be open to shifts in the outline or original goals. University of Maryland researchers found hackers make cyberattack attempts every 39 seconds: clear evidence that security approaches and programs must adapt due to the frequency of attacks. Leaders cannot become so attached to their initial road map that they refuse to adapt it to prioritize emerging threats.
Availability Heuristics
Availability heuristic theory states that people often rely on quickly recalled information instead of data when evaluating a particular situation or outcome.
This theory is evident in the skyrocketing success of social engineering. Employees are moving fast and often operate on autopilot. When they receive a seemingly safe link, distracted or overworked employees may not immediately perceive it as suspicious. Many phishing attempts look legitimate, and if an employee is inadvertently relying on the availability heuristic by recalling recent information, they may not recognize a fraudulent attempt.
People make decisions based on shortcuts. Even if they have all the data accessible, that does not guarantee they comb through it extensively to arrive at the "right" decision. No matter how well-trained on social engineering people are, heavy workloads and a busy 9-to-5 schedule mean relying on availability heuristics is inevitable. This also means anyone can easily fall victim to a phishing attempt.
Recognizing Behavioral Economics in Cybersecurity
It's evident there is no way to avoid behavioral economics in cybersecurity; regardless of how much data people have, innate humanity still impacts them. Not only are security professionals affected by other departments' biases, they are also at risk of falling victim to these biases themselves. Fortunately, simply being aware that people are not robots that always make logical and calculated decisions can help limit behavioral economics' negative impact.
Having visibility into how emotions impact work can enable cybersecurity professionals to more effectively drive security forward. Understanding availability heuristics, sunk cost fallacies, and mental accounting can help better frame security decisions as positive impacts to the bottom line. The more we understand behavioral economics, the more effectively we can present security-related investments and decisions as wins for profitability.