3 Lessons Security Leaders Can Learn from Theranos
Theranos flamed out in spectacular fashion, but you can still learn from the company's "worst practices."
In Alex Gibney's absorbing new HBO documentary, "The Inventor: Out for Blood in Silicon Valley," we see the cautionary tale of Elizabeth Holmes, the now-infamous entrepreneur who dropped out of Stanford at age 19 to start Theranos. The company promised to disrupt the $36 billion blood-testing market by testing for a wide range of diseases with a single drop of blood, but it turned out to be a massive fraud that bilked investors out of hundreds of millions of dollars and put consumers' lives at risk.
As someone who has worked in the cybersecurity industry for more than two decades, I couldn't help but notice overarching themes in how Holmes and her consigliere, Sunny Balwani, operated Theranos, and what security leaders might take away from their "worst practices."
Know Your Silos
Countless business articles caution against the risk of operational silos, but you'd be hard-pressed to find a more systemic and flagrant example than the story of Theranos, in which the engineering team responsible for building the machine was quite literally segregated from the laboratory chemists who were responsible for its test results. In the film, Theranos engineer Dave Philippides says, "If the people from the chemistry team could talk about what was coming next from the engineering team, they would have said, 'that's not going to solve the problem.' But since everyone was working on it separately, they could all keep on working forever without solving anything."
At Theranos, the silos were a feature, not a bug. Even so, the story should serve as a reminder for security leaders to be aware of their own silo blind spots and ask themselves how information, ideas, and vulnerabilities are shared (or not) across their organizations. How do problems get communicated up the chain of command? What incentives, or disincentives, might compromise the way information is exchanged?
Beware False Positives
One of the more interesting questions to emerge from the film is whether it was even a good idea for consumers to order their own blood tests without a doctor's prescription. Theranos successfully lobbied the Arizona legislature to "democratize" blood testing, and while there are compelling arguments behind this effort, Dr. Stefanie Seitz makes the case in the film that without context, consumers ordering these tests would either be lulled into complacency or convince themselves that they have cancer, saying: "You can't just look at a lab; you have to look at the whole patient."
In a similar vein, Katherine Hobson at fivethirtyeight.com makes the case that even the most accurate diagnostic tests, when applied to a large population, can yield a disproportionate share of false-positive results: "The wider the pool of people being tested, the greater the chance of false positives, which is why screening guidelines generally limit the population to be screened. The more independent tests you do at once, each with its own chance of error, the larger the chance that at least one of those tests produces an incorrect result."
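Hobson's arithmetic is easy to check. Here's a back-of-the-envelope sketch in Python; the 1% per-test false-positive rate is purely illustrative, and the tests are assumed to be independent:

```python
# Probability that a battery of independent tests produces at least one
# false positive, assuming every test has the same false-positive rate.
def chance_of_any_false_positive(num_tests: int, fp_rate: float) -> float:
    # P(at least one FP) = 1 - P(no FPs) = 1 - (1 - fp_rate)^num_tests
    return 1 - (1 - fp_rate) ** num_tests

# Even a modest 1% error rate compounds quickly across a panel of tests.
for n in (1, 10, 50, 100):
    print(f"{n:>3} tests -> {chance_of_any_false_positive(n, 0.01):.1%} chance of a false alarm")
```

At a 1% per-test error rate, a panel of 100 independent tests has a roughly 63% chance of flagging at least one healthy result as abnormal, which is exactly why screening guidelines limit who gets tested.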
There's a tendency in the current business landscape to assume that access to more data points automatically leads to better intelligence and more informed decisions. But as any seasoned security practitioner can attest, a sprawl of security information and event management (SIEM) systems and network appliances can trigger so many alerts and false positives that their output becomes essentially useless. This is why mature security operations centers increasingly emphasize the crucial role context plays in separating the signal from the noise.
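What "context" means in practice varies from shop to shop, but the underlying idea is simple enough to sketch. In the minimal example below, the asset inventory, field names (business_critical, internet_facing), and thresholds are all invented for illustration; the point is only that the same alert warrants different handling depending on where it fires:

```python
# A minimal sketch of context-aware alert triage: enrich raw alerts with
# asset context before deciding what reaches an analyst. All field names
# and thresholds here are hypothetical.
from dataclasses import dataclass

@dataclass
class Alert:
    asset_id: str
    signature: str
    severity: int  # 1 (low) .. 10 (high)

# Hypothetical asset inventory supplying the "context" around each alert.
ASSET_CONTEXT = {
    "db-prod-01": {"business_critical": True,  "internet_facing": False},
    "kiosk-17":   {"business_critical": False, "internet_facing": True},
}

def triage(alert: Alert) -> str:
    ctx = ASSET_CONTEXT.get(alert.asset_id, {})
    # The same signature means something very different on a crown-jewel
    # database than on a lobby kiosk; context drives the routing decision.
    if ctx.get("business_critical") and alert.severity >= 5:
        return "page-on-call"
    if ctx.get("internet_facing") and alert.severity >= 8:
        return "analyst-queue"
    return "log-only"

print(triage(Alert("db-prod-01", "unusual-login", 6)))  # page-on-call
print(triage(Alert("kiosk-17", "unusual-login", 6)))    # log-only
```

The design choice worth noticing is that the suppression logic lives in the enrichment layer, not in the detection rule itself, so the same signature can page a responder in one place and quietly log in another.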
A Culture of Fear Stifles Accountability
Theranos shielded itself from external scrutiny for so long in large part because Holmes and Balwani built a corporate culture steeped in fear and retaliation, which ensured that the very real problems its engineering team faced in building the company's machine could not be adequately addressed. As reporter John Carreyrou puts it in his book about Theranos, Bad Blood: "The biggest problem of all was the dysfunctional corporate culture in which it was being developed. Holmes and Balwani regarded anyone who raised a concern or an objection as a cynic and a naysayer. Employees who persisted in doing so were usually marginalized or fired, while sycophants were promoted."
Security leaders would do well to ask themselves: Does our corporate culture allow and encourage dissent from the lower ranks? All too often you hear stories of network engineers who discover a serious vulnerability but are too afraid to voice their concerns for fear of the potential fallout. It's challenging to instill a sense of accountability across an organization if it isn't embodied by the leadership team.
Theranos surely won't be the last tech company to flame out in such spectacular fashion. But hopefully we as security leaders can take a more constructive approach and use its example to help identify our own organizational deficiencies.