The industry can only go so far in treating security as a challenge that can be solved by engineering alone.

Lysa Myers, Security Researcher, ESET

December 5, 2019


In the early days of computing and connected devices, there was a lot we didn't yet know about designing secure products and environments. Today, there are established, well-known frameworks and plenty of advice to help everyone, from home users to CISOs of Fortune 500 companies, protect the data and devices in their care.

So why is it that good security practices are rarely adopted at every level of interaction with technology? It's because we still view the issue as a technology problem, not a people problem. Consider these four human factors that prevent the security industry from moving toward a better future.

Human Factor 1: Usability and Accessibility
The usability patterns baked into popular software (including operating systems) create a kind of inertia that keeps people from choosing the most secure option: these patterns are designed to make us flow from one app to another naturally and almost without thought. Such user-friendly designs do not encourage people to be cautious or wary.

What's worse, the steps we can and should take to protect ourselves are, more often than not, designed to interrupt this flow. While that interruption is not necessarily a bad thing, our industry still needs to understand why people practice poor online hygiene. Making things more secure is already a Sisyphean task; making them less secure is like rolling that same boulder downhill. This effect is magnified for people with different accessibility requirements, such as those with vision impairment.

Human Factor 2: Cybersecurity Skills
There are many reasons that companies are having a difficult time hiring and retaining people in cybersecurity roles, starting with the widespread assumption that this is a career path suitable only for people who've been immersed in coding and mathematics since the time they could reach a keyboard.

There's also a collective perception that security people can be incredibly hostile and antisocial, especially toward newcomers. Those who decide to seek a career in infosec often find that an entry-level job requires that they already have work experience. Too often, people who actually make it into the industry (especially those from underrepresented groups) leave midcareer due to burnout, an unsupportive culture, or an ill-defined career path.

Human Factor 3: Solutions in Search of a Problem
Technological advances are typically approached as if they're all unquestionably good. We often fail even to ask whether there are downsides to these innovations, much less how we might mitigate the damage after the fact. At the very least, we should all assume that any given product or service will eventually be misused, no matter how beneficial its original intent.

Human Factor 4: One Size Does Not Fit All
If you've ever gone to battle with your IT department over a policy that treats all employees as if their job functions were identical, you'll understand how frustrating such a cookie-cutter approach can be. Asking people to mold their lives or job circumstances to fit a security policy is simply unrealistic. It's a recipe for reduced productivity and may contribute strongly to employee burnout.

A Way Forward: Broadening Our Experience and Knowledge Base
The good news is that human problems are neither new nor unique to tech. There are entire industries that focus on studying human behavior, and there are people who specialize in the concerns of marginalized or vulnerable populations. Ideally, we should all be hiring people from these fields. When hiring challenges make that difficult, there is still work to be done on improving company culture, and outside experts can help. Our industry has a long history of partnering with law enforcement, for example; we should also be working with people who specialize in industrial/organizational and educational psychology, as well as with social workers and ethicists.

The security industry can only go so far in treating security as a problem that can be solved by engineering alone. Until we couple technology with a better understanding of the humans who are using that technology insecurely, there's a limit on how much progress we can ultimately make.


