Creating a Security Culture Where People Can Admit Mistakes
In cybersecurity, user error is the symptom, not the disease. A healthy culture acknowledges and addresses the underlying causes of lapses.
April 11, 2022
![Photo of various people's hands building an archway together out of wooden blocks](https://eu-images.contentstack.com/v3/assets/blt6d90778a997de1cd/blt5707c533d8660690/64f175c7e4356bf70fefb969/wooden-blocks-build-Andrey-Popov-AdobeStock.jpeg?width=700&auto=webp&quality=80&disable=upscale)
Source: Andrey Popov via Adobe Stock
Andy Ellis, advisory CISO for Orca Security and a longtime Akamai veteran, likes to tell a story about a potentially serious security incident. One of his team members was testing the email integration of a new incident tracking system. Unfortunately, the test email, titled "[TEST] Meteor strike destroys the headquarters," went to everyone in the company and created a loop that crashed the mail servers.
As Ellis recounts, "The next day the responsible employee tweeted a picture of themselves training for a 5K run, and I replied, 'Preparing to outrun the meteor?'"
The serious lesson from that story is to acknowledge errors but forgive them. "He's said, many times, that he knew at that moment it was going to be OK," Ellis says. "Creating a safe culture requires a lot of practices, and one of them is closure. Humor is a great way to provide closure because you rarely laugh about something that is still creating tension."
There isn't a lot to laugh about in cybersecurity, with security teams fighting off a growing number of cyberattacks and deploying protective measures for a fast-evolving environment. But security shouldn't be about browbeating people into doing the right thing or scaring them with the prospect of punishment. For security to be a team sport, you need to make people want to play.
It's vitally important to your business to create a security culture — that is, an atmosphere in which someone who messes up and breaks something feels they can report it without getting blasted for their actions. This idea isn't new, but considering recent analysis about how some companies aren't backing up their source code, sometimes stories need to be repeated. Here's how to build an organization that encourages people to admit their mistakes.
A positive security culture is defined by an atmosphere in which people feel safe to admit when they make a mistake, and the foundation for that is to make it harder for people to make mistakes. To do so, it's helpful to examine the different kinds of errors and how to address them.
According to user experience shop Nielsen Norman Group, there are two types of errors: slips and mistakes. A slip is like a typo, where the error occurs because of user inattention. This is what happens when, for example, a person types the wrong command into a shell and deletes a file instead of reading it. Some level of slips is unavoidable, but you can reduce them by addressing interface issues: making it easier to generate and manage secure passwords, or placing the OK button farther away from the Reset button, for example.
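That design principle translates directly into tooling. As a minimal Python sketch (the function names and prompt text here are hypothetical, not drawn from any particular tool), a destructive operation can require the user to retype the target's name, interrupting the autopilot behavior that produces slips:

```python
import sys

def confirm_destructive(action: str, target: str) -> bool:
    """Require the user to retype the target name before a destructive action.

    Retyping (rather than a quick y/n prompt) interrupts the autopilot
    behavior that causes slips.
    """
    print(f"You are about to {action} '{target}'. This cannot be undone.")
    typed = input(f"Type '{target}' to confirm: ")
    return typed.strip() == target

if __name__ == "__main__":
    if confirm_destructive("delete", "quarterly-report.txt"):
        print("Proceeding...")  # the real destructive operation would run here
    else:
        print("Aborted: confirmation did not match.")
        sys.exit(1)
```

The point of the retype step is that it costs a deliberate act of attention, which is exactly the resource a slip lacks.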
Mistakes are what happen when the user misunderstands the goal and how to accomplish it — you might go through the correct steps, but you won't reach your goal because you're on the wrong path. A classic, and particularly high-stakes, example is a medical error, where someone gives a patient the wrong medication because the labels are too similar or the vial is stored in the wrong spot. This is usually what people mean when they say "user error."
But we can look at these errors as design errors, rather than user errors, and start to learn how to head them off.
"Practice saying that 'human error is a symptom of a system in need of redesign,' which I learned from [MIT professor] Nancy Leveson," says Ellis. "Once you accept the truth of the statement, you can start to see deeper problems and really start to learn from incidents. An employee got phished? You can realize that the broken system is how dangerous email clients, browsers, and network authentication are."
The idea that human error is the symptom, not the disease, is prominent in Leveson's work. She's a professor of aeronautics and astronautics, where small mistakes can cost millions of dollars — and human lives. Her work as director of the MIT Partnership for Systems Approaches to Safety and Security ripples outside of aeronautics and into less physical fields like cybersecurity, as Ellis' remarks show.
"For far too long, cybersecurity has been perceived as purely a technical challenge. Organizations and leaders are now realizing that we also have to address the human side of cybersecurity management," says Lance Spitzner, director of research and community at SANS, in the introduction to his course "Leading Cybersecurity Change: Building a Security-Based Culture."
Dr. Jessica Barker, co-CEO of Cygenta and author of the book "Confident Cyber Security," cites Sidney Dekker's work on "just culture" as foundational to tech's implementation of human-focused security. Within cybersecurity, for example, just culture might involve putting strong identity and access management (IAM) and multifactor authentication (MFA) in place rather than dinging workers for compromised passwords or authentication failures.
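To make the contrast concrete, a just-culture control removes the opportunity for error rather than punishing it after the fact. Below is a hedged Python sketch of that idea: a hypothetical `require_mfa` decorator guarding a sensitive action. The session structure and function names are invented for illustration, not drawn from any real IAM product:

```python
from functools import wraps

class MFARequired(Exception):
    """Raised when a sensitive action is attempted without a verified MFA session."""

def require_mfa(func):
    """Enforce MFA at the system level, so no individual user's lapse
    (a phished or reused password) is enough to cause an incident."""
    @wraps(func)
    def wrapper(session, *args, **kwargs):
        if not session.get("mfa_verified", False):
            raise MFARequired(f"{func.__name__} requires a verified MFA session")
        return func(session, *args, **kwargs)
    return wrapper

@require_mfa
def rotate_api_keys(session, account_id):
    print(f"Rotating API keys for {account_id}")

# A stolen password alone is not enough; the control, not the user, enforces that.
session = {"user": "jdoe", "mfa_verified": False}
try:
    rotate_api_keys(session, "acct-42")
except MFARequired as err:
    print(f"Blocked: {err}")
```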
"When a culture is retributive, there is a focus on individuals seeking to place blame and administer punishment. People often become a scapegoat for systemic problems," says Barker. "A restorative just culture, on the other hand, looks at the deeper conditions that facilitated the incident and is more forward-looking while still holding people to account as appropriate.
"A restorative just culture recognizes that emphasizing individual blame and punishment does not reduce the likelihood of incidents, it simply reduces the likelihood of people reporting incidents and therefore undermines opportunities to identify systemic issues and learn from them."
To borrow from the life-or-death medical world, the University of Utah's School of Health shares three principles for building a safety culture:
Stuff happens. Acknowledge and report errors — we all make mistakes.
No-blame. Support a no-blame culture by speaking up and encouraging others to raise concerns.
Continuously improve. Commit to process-driven learning and prevention.
Brian Wrozek, CISO at Optiv Security, says the greatest roadblock to building a security culture is "time and effort."
"Organizations that are serious about it make building that culture a priority. It doesn't happen by accident," he adds. "Far too often security and IT professionals assume employees know better or that they'll know how to act on or report suspicious behavior. It's important to clearly state why certain procedures are in place, in addition to the how to follow them."
Training in security practices is a vital part of building a security culture, according to Wrozek.
"Awareness and training sessions need to happen often," he says. "Organizations can institutionalize a healthier security culture by conducting tabletop exercises to ensure employees receive hands-on practice in responding to different scenarios."
Optiv Security's Wrozek shares his own positive experience with security culture early in his career. He was screening people at a shareholders meeting, checking to make sure they had the card that granted them permission to enter. But one member of corporate leadership approached without their card.
"I was torn. Do I ask them for their card and risk getting fired ... or do I just let them in assuming they're allowed?" Wrozek says. "Well, I politely asked the person for their card and was pleasantly surprised when the person said, 'I don't have it because I left it on my desk. You're right, though. I'll go back and get it. Good work.' When leadership follows the rules in view of others and praises those for following proper procedures even when it might be uncomfortable, it sends a powerful message to the rest of the company."
That resonates with me. Back in 1996, I accidentally deleted the splash page for The Site, the website for the MSNBC television show of the same name. Visiting the main site URL would produce a 404 error. I had a Unix server at home, so I was just playing around at my new job and exploring the folder system — and I hit the wrong key. I jumped up and ran to the webmaster, who restored it in moments from a backup. Somehow she didn't panic, and I never heard any blowback about the incident. It was a case study both in things done right (good backups, good working atmosphere) and wrong (overly permissive role permissions, me being a dope).
"Security teams and security businesses frequently operate by appealing to fear, uncertainty, and doubt (FUD). This creates alarm, fosters distrust, and ultimately undermines the goal of empowering people," says Kim Burton, head of trust and compliance at Tessian. "Security teams can encourage vulnerable sharing by becoming trusted partners, by leaving their ego at the door, and centering the needs of their co-workers. A security team's actions need to mirror the desired culture. People don't need to be afraid to take appropriate actions — they just need a compassionate ally."
One of the most important mechanisms for a security culture to implement is a system for people to report incidents. According to the National Academies of Sciences, Engineering, and Medicine, there are two types of reporting systems: mandatory and voluntary.
"Reporting systems have the potential to serve two important functions. They can hold providers accountable for performance or, alternatively, they can provide information that leads to improved safety," the National Societies explains in its publication "To Err Is Human: Building a Safer Health System." While mandatory reporting systems catalog errors in order to assign responsibility (often to meet regulatory or shareholder needs), voluntary systems are usually confidential, meant to catalog errors that didn't result in serious harm; "near misses," where someone picked up the wrong medicine vial but realized it right away; or in cybersecurity, when a user received a suspicious message in Outlook but reported it with the Report Phish button.
"How a company reacts to someone reporting an issue or incident is really telling: If there is an emphasis on punishment over identifying and addressing the root causes, then other people won't feel safe reporting issues in future," Cygenta's Barker warns. "But if people are treated with fairness and compassion when they report an incident, this helps build a culture of trust and psychological safety in which individuals are going to feel more comfortable speaking up."
Optiv Security's Wrozek shares his list of elements for a good reporting system:
Documented and transparent: "It's important to have documented and transparent processes on how internal investigations and incident responses are handled. These are potentially serious events, and it's good to know they will be handled professionally and in a fair, repeatable manner."
Widely based: "Include others in the process [security, HR, legal, ethics] to offer different perspectives and avoid any appearance of randomness, favoritism, or retaliation when determining consequences."
Confidential: "Be sure to protect the confidentiality of the person."
Timely: "Share internal use cases and success stories highlighting how the reporting prevented the damage from being worse."
Convenient: "Lastly, make it easy to report incidents and offer multiple avenues [email, voice, anonymous]."
"We address reporting quite a bit in Mimecast's awareness training modules. The sooner security professionals become aware of a potential problem, the sooner they can start mitigating it," says Jann Yogman, senior director of product management at Mimecast. "Mimecast's new Executive Training series talks specifically about how leaders can create an environment where employers feel empowered to report incidents, admit mistakes, and verify requests when they're not sure if they're real."
Cygenta's Barker cites a client who encouraged reporting of even serious mistakes by switching how they reacted to participants' errors in a phishing exercise.
"Instead of issuing people warnings for clicking simulated phishes, they now reward and celebrate those who report the most and who report the quickest," she says. "By focusing on positive reinforcement of the desired behavior, this organization has seen a huge rise in their report rate."
Orca Security's Ellis agrees, and he adds that reporting shouldn't be punished indirectly either.
"A key element of a safety culture is thanking people who do report their own 'errors' and not saddling them with more work," he says. "Many organizations respond to these moments, unfortunately, by adding onerous processes that don't increase safety but do increase effort. These are often viewed as punitive measures, which make your people less likely to identify and report problems."
Organizations should instead take the opposite approach, he advises.
"Leaders need to vocally and visibly refuse to punish people for mistakes, thank them for identifying problems in the system, and seek to understand the complex hazards that lead to safety and incidents, avoiding simple 'blame the user' analysis," Ellis says.
Adds Tessian's Burton: "Most people are working hard to do what they believe is best for their work and the business; security teams should operate with the belief that their co-workers are trying their best and empathize with the problems their co-workers are working to solve. Proactive and sustained communication through routine check-ins, cross-team collaboration, and a focus on outcomes can help with developing empathy and respect for each other's work."
It's a two-way street, Mimecast's Yogman says.
"The message needs to start at the top. And employees need to understand the role they play in keeping the company safe," he says. "Our training, geared toward regular employees and now company leaders, can help create a culture where reporting becomes the norm."
Incorporating learnings from error reporting is vital, says Optiv Security's Wrozek.
"Everyone makes mistakes, but a company never wants to compound it by trying to cover it up," he says. "By reporting it quickly, you give security and IT and, depending on the incident, legal an opportunity to minimize the impact. In some cases, it may be a reportable event [industry regulation or law], and failing to take timely action or report it makes the situation worse for the entire company."
Cygenta's Barker notes that trust is built when employees see that a company identifies and addresses the root cause of an error. She cites an example of a user whose error caused them to expose data internally. Rather than jumping on the symptom — one person oversharing data — the company sought to find the cause and changed the process that allowed this oversharing to happen in the first place.
"The person involved recognized that they had made a mistake and learned from it, but most importantly the organization learned from it," she says. "A broken process was identified and fixed, and people also learned that they can and should report incidents without fear. "It is vital that this is led from the top, from those who are most influential and those who direct the priorities of an organization. The behavior modeled by leadership is the behavior that others in an organization will follow."