Proactively addressing your biases can help you build a resilient and adaptable security foundation.

Lynda Grindstaff, Senior Director of the Innovation Pipeline, Intel Security

December 15, 2016

As we move from one year to the next, it is valuable to reflect on what has changed and what hasn’t in our areas of interest. In cybersecurity, there are two notable things that have not changed over the past year, and one that has changed significantly.

The two issues that have changed little are the ongoing scarcity of experienced security personnel and the related lack of diversity in most organizations' security workforces, especially the low number of women.

Where cybersecurity has changed a lot in the past year is the rate of innovation -- by organizations and their adversaries -- as they strive to gain an advantage.

There have been a fair number of blogs, articles, and research papers on these topics, and my goal is not to rehash those. Instead, I’d like to explore these three items from the perspective of unconscious bias -- the quick decisions that we make automatically and often without real awareness.

On the scarcity of experienced security personnel, many people I’ve spoken with have an unconscious bias toward hiring candidates with undergraduate degrees in security or with various security certifications. But there are many other qualified individuals out there -- people who came through non-degree programs or who lack a certification -- and many of them may already be working at your company. Consider organizing hacking contests or using video games with a realistic hacking component to identify potential candidates and reduce this bias.

Another unconscious bias that feeds the security workforce shortage centers on automation. Again, many people I’ve spoken with are uneasy about letting machines make decisions such as blocking access, killing processes, or deleting files -- tedious but critical components of any set of cyber defenses. It is time to actively counter this bias. Automation that takes on tedious, repetitive tasks while supporting and augmenting the human security team is essential to handling the volume of attacks, alerts, and cleanup activities that most organizations face every day.
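
As a rough illustration of that division of labor (a hypothetical sketch, not a description of any particular product or of Intel Security's tooling), the snippet below routes routine, high-confidence alerts to automated remediation and escalates anything novel or ambiguous to a human analyst. The alert categories, the confidence threshold, and every name here are assumptions made for the example.

# Hypothetical triage sketch: automate the repetitive, high-confidence cases,
# escalate everything else to a human analyst. All names and thresholds are
# illustrative assumptions, not a real product API.
from dataclasses import dataclass

@dataclass
class Alert:
    source: str        # e.g. "endpoint", "gateway"
    category: str      # e.g. "known_malware_hash", "anomalous_login"
    confidence: float  # detector's confidence, 0.0 to 1.0

# Categories the team has agreed are safe to remediate automatically.
AUTO_REMEDIATE = {"known_malware_hash", "blacklisted_ip"}

def triage(alert: Alert) -> str:
    """Return 'auto' for routine cleanup, 'human' for anything ambiguous."""
    if alert.category in AUTO_REMEDIATE and alert.confidence >= 0.95:
        return "auto"    # machine handles the tedious, repetitive case
    return "human"       # analyst reviews anything novel or uncertain

if __name__ == "__main__":
    alerts = [
        Alert("endpoint", "known_malware_hash", 0.99),
        Alert("gateway", "anomalous_login", 0.80),
    ]
    for a in alerts:
        print(f"{a.category}: routed to {triage(a)}")

The point of the sketch is not the specific rule but the pattern: machines absorb the high-volume, well-understood cleanup work, and people keep the judgment calls.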

Increasing the number of women in the cybersecurity workforce is a longer-term project: it requires engaging more women and girls with security and technology concepts, then training, recruiting, and retaining them. This can feel like a catch-22, because women often look for jobs and environments that already have a reasonable percentage of women. It is equally important to look at the work environment around you and make the changes needed to attract and retain women. Sometimes a group's behavior unintentionally excludes others, whether through common topics of conversation, team-building activities, or after-work gatherings.

Finally, and possibly most dangerous, is the interplay of unconscious bias and innovation. Studies repeatedly show that diverse groups are more challenging to work in but produce better, more innovative solutions. Attackers continually benefit from this kind of diversity, sharing and trading tips and code across national boundaries among criminals, nation-state actors, and others with an interest in the technology. For example, a recent report on cyberattacks targeting the healthcare industry includes examples of attackers looking for partners, helping each other through technical difficulties, and offering congratulations -- and a bit of envy -- after the theft of medical records.

Adversaries have found new and creative ways to attack over the past year, including significant innovation in ransomware and in DDoS attacks built on thousands of compromised webcams. Does your organization assume it will not be affected because it is located in a different country from its suppliers and customers? Have you considered the impact of new devices and apps that are popular with your employees or consumers but not used by everyone on the team? Do you discount comments from younger employees because they have less experience? Each of these is an example of an unconscious bias that can increase the risk to your organization.

Our predictions for 2017 point to another active year for cybersecurity. Proactively addressing your biases can help you build a resilient and adaptable security foundation that can more effectively detect, protect against, and correct known threats, as well as those that haven’t even been invented yet.

About the Author

Lynda Grindstaff

Senior Director of the Innovation Pipeline, Intel Security

Lynda Grindstaff creates the future for Intel Security as the Senior Director of the Innovation Pipeline. In this role, Lynda leads a global team that brings the future to life for Intel Security through innovative strategies and prototypes. Her tenure with Intel spans two decades and includes numerous technical and leadership roles in business client strategy, innovation marketing, system software development, chipset validation, and management of a global technical marketing team based in the US and India. A respected expert in her field, she has two patents and has won the Intel Achievement Award, the Intel Software Quality Award, and the Society of Women Engineers Emerging Leader and Fellow Awards. Lynda holds a BS in computer science and is a valued industry conference speaker. She has a passion for coaching, growing, and developing technical leaders and remains active in community outreach programs for the Society of Women Engineers, the National Center for Women & Information Technology, and the Women at Intel Network.
