5/2/2018 12:45 PM

Automation Exacerbates Cybersecurity Skills Gap

Three out of four security pros say the more automated AI products they bring in, the harder it is to find trained staff to run the tools.



As the security industry grapples with the consequences of a constrained supply of experienced cybersecurity talent, many pundits have lauded automation as a way out. But a new survey out today shows that many security professionals are experiencing the opposite effect. The more artificial intelligence (AI)- and machine learning-powered tools they bring in, the more they need experienced staff to deal with those tools. 

Conducted by Ponemon Institute on behalf of DomainTools, the study queried over 600 US cybersecurity professionals on the effects of automation on their staffing situations. The results run counter to the general belief that automation will ameliorate the cybersecurity skills gap.

According to the study, 75% of organizations report that their security team is currently understaffed, and the same proportion say they have difficulty attracting qualified candidates. Over four in 10 organizations report that the difficulties they've faced with recruiting and retaining employees have led to increased investment in cybersecurity automation tools. However, 76% of respondents report that machine learning and AI tools and services aggravate the problem because they increase the need for more highly skilled IT security staff. And only 15% of organizations report that AI is a dependable and trusted security tool for their organization.

This jibes with what many experienced security practitioners have to say about automation.

"It is very tempting to think that automation will fix a lot of cybersecurity issues. However, automation mechanisms are worthless without a staff that can smartly leverage them and implement them," says Frank Downs, senior manager of Cyber Information Security Practices at ISACA. "An organization can purchase the most incredible intrusion detection/prevention system in the world. However, if they don't have the staff to configure, implement, and manage it — it might as well stay uninstalled."

That's not to say there's no value in automation; it's just that the same "GIGO" principle applies to cybersecurity automation as it does to any other technical system.

"Automation really helps make the people on the team more effective. There's no substitute for human flexibility and intuition, so automation lets you take repetitive tasks off the table and enables people to do more interesting work," explains Todd Inskeep, principal for Booz Allen Hamilton and advisory board member for RSA Conference. "That's important, but one of the first things I learned about computers — 'GIGO,' or 'garbage in, garbage out' — still applies with automation and machine intelligence."

The other issue is that automation tends to follow a maturity path in which even the most automated systems are never fully up to date with the latest threat trends. As a result, there will always be a need for experienced humans adaptable enough to deal with the unknown threats of tomorrow, says Lucas Moody, CISO for Palo Alto Networks.

"If you break it down, automation is about taking care of yesterday's problems. We are automating what we've mastered and what we understand well," says Moody. "In order to tackle tomorrow's challenges, we need to hire professionals who are strategic, creative, and adaptable. We're really looking for those individuals who thrive on change and problem-solving." 


Ericka Chickowski specializes in coverage of information technology and business innovation. She has focused on information security for the better part of a decade and regularly writes about the security industry as a contributor to Dark Reading.
