Experts discuss the meaning of action bias and how it presents a threat to IT security leaders, practitioners, and users.

Kelly Sheridan, Former Senior Editor, Dark Reading

June 28, 2021

5 Min Read

When a data breach hits, the best response is to act quickly and forcefully … right?

Not necessarily, experts say. The impulse for cybersecurity pros to have control over a situation is common — after all, you don't want to be the CISO who didn't act after learning about an attack — but hastily made decisions may do more harm than good or create a problem where one didn't exist.

Action bias, a subset of cognitive bias, describes the human tendency to favor action over inaction. This is the perceived urgency that tells you, "Don't just stand there, do something!" even when there's no evidence that action will be helpful. Cybersecurity practitioners, along with healthcare workers and air-traffic controllers, are among the professionals who face this bias on a regular basis.

There are a couple of reasons why action bias strikes, says Doug Hough, senior associate at the Johns Hopkins University Bloomberg School of Public Health. Sometimes the motivation is to show leadership or demonstrate value. Action bias also preempts second-guessing: perhaps you didn't do the right thing by acting, but at least you did something, as opposed to doing nothing.

In some cases, though, it's better to take time in reacting to a situation rather than jumping in with both feet. Hough uses the example of a soccer goalie: Sometimes the best move for a goalie is to stand in the middle of the goal rather than dive to one side or the other. What looks like inaction is a deliberate choice; the goalie is being strategic in waiting to see where the ball goes.

The same applies in the professional realm, where slowing down responses can lead to a better outcome. "Sometimes it's better to let issues and problems percolate a bit … and understand them enough, so that you can really attack the problems intelligently and efficiently," Hough explains.

Josiah Dykstra, technical fellow at the National Security Agency's Cybersecurity Collaboration Center, uses phishing as an example. An employee who receives a phishing email that demands they take action will feel compelled to act, whether that means clicking a malicious link or downloading an attacker's attachment. The urgency they feel leads them to make the wrong choice.

"When instead, if they slowed down and thought more carefully about what to do, they might not become a victim in the same way," he says. "That impulse to get control shows up in all kinds of situations, whether we are users, cybersecurity defenders, or leaders."

Ransomware is one scenario in which these three groups act very differently, Dykstra continues. If an employee's machine is infected with ransomware, their immediate reaction is to try to get their data back. They aren't aware of the many security mechanisms that were put in place to stop this very problem.

Defenders have a different view. They want to jump into action right away to figure out how this problem occurred so they can fix it. Those in security leadership have another reaction. From their perspective, the rational thing is to say, "This can never happen again; we can never allow ransomware." They're willing to spend more money and allocate more budget to ensure there isn't a repeat attack.

"From my perspective as a user or a cybersecurity defender, that seems kind of crazy; it seems like overkill," Dykstra says. "But my view of the problem is quite different than the user's or the executive leadership's."

The attitude of "this can never happen again" can make action bias worse, Hough adds. The business feels it has to do something — anything — to prevent another incident, and if and when an incident does happen, it feels pressure to respond instantaneously. But without a process for addressing the situation, "act fast" could cause long-term damage to organizations and their employees.

Long-Term Effects & Proactive Steps
Just as it shapes how organizations respond to attacks, action bias contributes to the stress, burnout, and mental fatigue affecting security practitioners, the experts say.

From a defender's perspective, the attacks never stop, says Dykstra. Organizations have set impossible goals, such as "don't allow any bad things to happen," which puts continual pressure on employees to meet targets that often can't be met. The buildup of these situations contributes to stress that will only grow, because the crises never stop.

"Hackers are incentivized to keep trying," he adds. "They will keep coming over and over again, but we don't need to let that lead to burnout. We can help build resilience in the people, and resilience in the processes that we have in our organizations, so it isn't so stressful in those situations — they know what to do; they've done it before."

Both Hough and Dykstra agree security teams can do more to prepare for attacks so the immediate reactive choices are more routine. Creating resilience in people and processes can lessen the stress when attacks hit and help employees, practitioners, and leaders act quickly with confidence.

They advise conducting tabletop exercises and red teaming to create and practice a routine for responding to an attack. Going through the exercise and building this process will help identify who must be in the room to solve certain problems, what everyone's individual roles will be, and how they will work together to react to a crisis. The idea is not to sit still and let bad situations unfold but to give the situation forethought and prepare to respond more appropriately.

"In a sense, you're developing a process, not an answer," Hough says. "It's not that you'll develop the answer when every single thing comes along, but you have a structure of how you should proceed, which would then enable you to avoid, or at least ameliorate, action bias."

Hough and Dykstra will discuss action bias, its effects on the cybersecurity industry, and how to address it in an upcoming Black Hat talk entitled "Action Bias and the Two Most Dangerous Words in Cybersecurity."

About the Author

Kelly Sheridan

Former Senior Editor, Dark Reading

Kelly Sheridan was formerly a Staff Editor at Dark Reading, where she focused on cybersecurity news and analysis. She is a business technology journalist who previously reported for InformationWeek, where she covered Microsoft, and Insurance & Technology, where she covered financial services. Sheridan earned her BA in English at Villanova University. You can follow her on Twitter @kellymsheridan.
