Dark Reading is part of the Informa Tech Division of Informa PLC



Chad Loeven

3 Lessons Security Leaders Can Learn from Theranos

Theranos flamed out in spectacular fashion, but you can still learn from the company's "worst practices."

In Alex Gibney's absorbing new HBO documentary, "The Inventor: Out for Blood in Silicon Valley," we see the cautionary tale of Elizabeth Holmes, the now infamous entrepreneur who dropped out of Stanford at age 19 to start Theranos. The company promised to disrupt the $36 billion blood-testing market by testing for a wide range of diseases via a single drop of blood, but turned out to be a massive fraud that bilked investors out of billions of dollars and put the lives of consumers at risk.

As someone who has worked in the cybersecurity industry for more than two decades, I couldn't help but think about some of the overarching themes in how Holmes and her consigliere, Sunny Balwani, operated Theranos, and what security leaders might take away from their "worst practices."

Know Your Silos
Countless business articles caution against the risk of operational silos, but you'd be hard-pressed to find a more systemic and flagrant example than the story of Theranos, in which the engineering team responsible for building the machine was quite literally segregated from the laboratory chemists responsible for its testing results. In the film, Theranos engineer Dave Philippides says, "If the people from the chemistry team could talk about what was coming next from the engineering team, they would have said, 'that's not going to solve the problem.' But since everyone was working on it separately, they could all keep on working forever without solving anything."

At Theranos, the silos were a feature, not a bug. Regardless, it should serve as a reminder for security leaders to be aware of their own silo blind spots and ask themselves how information, ideas, and vulnerabilities are shared — or not — across their organizations. How do problems get communicated up the chain of command? What incentives — or disincentives — are in place that might compromise the way information is exchanged?

Beware False Positives
One of the more interesting questions to emerge from the film was whether it was even a good idea for consumers to order their own blood tests without a doctor's prescription. Theranos successfully lobbied the Arizona legislature to "democratize" blood testing, and while there are compelling arguments behind this effort, Dr. Stefanie Seitz makes the case in the film that without context, consumers ordering these tests would either be lulled into complacency or convince themselves that they have cancer, saying: "You can't just look at a lab; you have to look at the whole patient."

In a similar vein, Katherine Hobson at fivethirtyeight.com makes the case that even the most accurate diagnostic tests, applied to a large population, can yield a disproportionate share of false-positive results: "The wider the pool of people being tested, the greater the chance of false positives, which is why screening guidelines generally limit the population to be screened. The more independent tests you do at once, each with its own chance of error, the larger the chance that at least one of those tests produces an incorrect result."
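Hobson's point is simple arithmetic: run enough independent tests, each with a small chance of error, and the odds of at least one error become large. A quick sketch (the 99% specificity and 50-test panel below are illustrative assumptions, not figures from Theranos or the article):

```python
def p_at_least_one_false_positive(specificity: float, n_tests: int) -> float:
    """Probability that a healthy person receives at least one false
    positive across n independent tests, each with the given specificity."""
    return 1 - specificity ** n_tests

# A single 99%-specific test rarely errs on a healthy patient...
print(round(p_at_least_one_false_positive(0.99, 1), 3))   # 0.01
# ...but a 50-test panel flags roughly 40% of healthy patients at least once.
print(round(p_at_least_one_false_positive(0.99, 50), 3))  # 0.395
```

The same compounding is why screening guidelines limit both the population tested and the number of tests run at once.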

There's a tendency in the current business landscape to assume that access to more data points will lead to better intelligence and more informed decisions. But as any seasoned security practitioner can attest, a multitude of security information and event management (SIEM) systems and network appliances can trigger so many alerts and false positives that the output becomes essentially useless. This is why mature security operations centers increasingly emphasize the crucial role context plays in helping them separate the signal from the noise.

A Culture of Fear Stifles Accountability
Theranos shielded itself from external scrutiny for so long in large part because Holmes and Balwani built a corporate culture steeped in fear and retaliation, which ensured that the very real problems its engineering team faced in building the company's machine could not be adequately addressed. As reporter John Carreyrou puts it in his book about Theranos, Bad Blood: "The biggest problem of all was the dysfunctional corporate culture in which it was being developed. Holmes and Balwani regarded anyone who raised a concern or an objection as a cynic and a nay-sayer. Employees who persisted in doing so were usually marginalized or fired, while sycophants were promoted."

Security leaders would do well to ask themselves: Does our corporate culture allow and encourage dissent from the lower ranks? All too often you hear stories of network engineers who discover a serious vulnerability but are too afraid of the potential fallout to voice their concerns. It's challenging to instill a sense of accountability across an organization if it isn't embodied by the leadership team.

Theranos surely won't be the last tech company to flame out in such spectacular fashion. But hopefully we as security leaders can take a more constructive approach and use its failures to help identify our own organizational deficiencies.


Chad Loeven has been involved in enterprise security for over 20 years. Prior to VMRay he managed technology alliances at RSA, the security division of EMC. He came on board RSA via its acquisition of Silicium Security and Silicium's ECAT ETDR (Endpoint Threat Detection and ...

Reader comment:
4/4/2019 | 3:31:38 PM
The Vision is flawed: Holmes v. Jobs and Disney
Walt Disney once said, "Ya know, it's kind of fun doing the impossible." And his studio turned impossible animation into brilliance. Genius. Jobs would say the same; he admired Disney as a true innovator and, of course, picked up Pixar and ran with it into the Disney world while rebuilding Apple. A true innovator of the first order following a leading pioneer. Holmes would agree, but what she was doing was technically impossible due to the design of her beast. It was REALLY impossible to pack a complex blood-test protocol into a small box. The technology of today just doesn't permit a full lab to be squeezed into a package.

Literally impossible, and she denied that: a true believer. That is a visionary. A good trait when Jobs employed his famous reality distortion field and could make Apple bend to his will; the results were the Macintosh, iPhone, iPod, and iPad. The computer itself was possible, animation was and is possible, but Holmes was bleeding-edge impossible. So beware the genius vision, for if the idea itself is flawed, nothing built on it will work. Jobs and Disney redesigned existing tech and protocols to serve their visions. Holmes was building a new one and, worse, not realizing that it did not, and could not, work, at least with tech as it stands today. A microprocessor did not exist in 1944, for example. Maybe 20 years from now we can get what she dreamed of, but not today. Too early for her time, and too many investors felt her pain.