Dark Reading is part of the Informa Tech Division of Informa PLC



Chad Loeven

3 Lessons Security Leaders Can Learn from Theranos

Theranos flamed out in spectacular fashion, but you can still learn from the company's "worst practices."

In Alex Gibney's absorbing new HBO documentary, "The Inventor: Out for Blood in Silicon Valley," we see the cautionary tale of Elizabeth Holmes, the now infamous entrepreneur who dropped out of Stanford at age 19 to start Theranos. The company promised to disrupt the $36 billion blood-testing market by testing for a wide range of diseases via a single drop of blood, but turned out to be a massive fraud that bilked investors out of billions of dollars and put the lives of consumers at risk.

As someone who has worked in the cybersecurity industry for more than two decades, I couldn't help but think about some of the overarching themes related to how Holmes and her consigliere, Sunny Balwani, operated Theranos and what security leaders might take away from their "worst practices."

Know Your Silos
Countless business articles caution against the risk of operational silos, but you'd be hard-pressed to find a more systemic and flagrant example than the story of Theranos, in which the engineering team responsible for building the machine was quite literally segregated from the laboratory chemists responsible for its testing results. In the film, Theranos engineer Dave Philippides says, "If the people from the chemistry team could talk about what was coming next from the engineering team, they would have said, 'that's not going to solve the problem.' But since everyone was working on it separately, they could all keep on working forever without solving anything."

At Theranos, the silos were a feature, not a bug. Regardless, it should serve as a reminder for security leaders to be aware of their own silo blind spots and ask themselves how information, ideas, and vulnerabilities are shared — or not — across their organizations. How do problems get communicated up the chain of command? What incentives — or disincentives — are in place that might compromise the way information is exchanged?

Beware False Positives
One of the more interesting questions to emerge from the film was whether it was even a good idea for consumers to order their own blood tests without a doctor's prescription. Theranos successfully lobbied the Arizona legislature to "democratize" blood testing, and while there are compelling arguments behind this effort, Dr. Stefanie Seitz makes the case in the film that without context, consumers ordering these tests would either be lulled into complacency or convince themselves that they have cancer, saying: "You can't just look at a lab; you have to look at the whole patient."

In a similar vein, Katherine Hobson at fivethirtyeight.com makes the case that even the most accurate diagnostic tests, applied to a large population, can yield a disproportionate share of false-positive results: "The wider the pool of people being tested, the greater the chance of false positives, which is why screening guidelines generally limit the population to be screened. The more independent tests you do at once, each with its own chance of error, the larger the chance that at least one of those tests produces an incorrect result."

There's a tendency in the current business landscape to assume that access to more data points will lead to better intelligence and more informed decisions. But as any seasoned security practitioner can attest, a multitude of security information and event management (SIEM) and network appliances can trigger so many alerts and false positives that the output becomes essentially useless. This is why mature security operations centers increasingly emphasize the crucial role context plays in separating the signal from the noise.
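The arithmetic behind Hobson's point is easy to sketch. The short Python example below is illustrative only (the function names and the rates plugged in are mine, not from the article or from fivethirtyeight): it shows how a modest per-test false-positive rate compounds across many independent tests, and how a rare condition drags down the predictive value of even a highly accurate test.

```python
def p_at_least_one_false_positive(fpr: float, n_tests: int) -> float:
    """Chance that at least one of n independent tests fires falsely,
    given each test's individual false-positive rate."""
    return 1.0 - (1.0 - fpr) ** n_tests


def positive_predictive_value(prevalence: float, sensitivity: float,
                              fpr: float) -> float:
    """P(condition is present | test came back positive), via Bayes' rule."""
    true_pos = prevalence * sensitivity
    false_pos = (1.0 - prevalence) * fpr
    return true_pos / (true_pos + false_pos)


if __name__ == "__main__":
    # A 1% false-positive rate looks great -- until you run 100 tests at once.
    print(f"{p_at_least_one_false_positive(0.01, 100):.2f}")  # ~0.63

    # For a condition affecting 0.1% of the population, a 99%-sensitive test
    # with a 1% false-positive rate is still wrong most times it alerts.
    print(f"{positive_predictive_value(0.001, 0.99, 0.01):.2f}")  # ~0.09
```

The same math applies to a SIEM: a detection rule with a seemingly low false-positive rate, multiplied across thousands of events per day, buries the rare true alert under noise unless context is used to narrow what gets flagged.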

A Culture of Fear Stifles Accountability
Theranos shielded itself from external scrutiny for such a long time in large part because Holmes and Balwani built a corporate culture steeped in fear and retaliation, which ensured that the very real problems its engineering team faced in building the company's machine could not be adequately addressed. As reporter John Carreyrou puts it in his book about Theranos, Bad Blood: "The biggest problem of all was the dysfunctional corporate culture in which it was being developed. Holmes and Balwani regarded anyone who raised a concern or an objection as a cynic and a nay-sayer. Employees who persisted in doing so were usually marginalized or fired, while sycophants were promoted."

Security leaders would do well to ask themselves: Does our corporate culture allow and encourage dissent from the lower ranks? All too often you hear stories of network engineers who discover a serious vulnerability but don't voice their concerns for fear of the potential fallout. It's challenging to instill a sense of accountability across an organization if it's not embodied by its leadership team.

Surely, Theranos won't be the last tech company to burn up in such a spectacular fashion. But hopefully we as security leaders can take a more constructive approach that will help to identify our own organizational deficiencies.


Chad Loeven has been involved in enterprise security for over 20 years. Prior to VMRay, he managed technology alliances at RSA, the security division of EMC. He came on board RSA via its acquisition of Silicium Security and Silicium's ECAT ETDR (Endpoint Threat Detection and ...
User Rank: Ninja
4/4/2019 | 3:31:38 PM
The Vision is flawed: Holmes v. Jobs and Disney
Walt once said, "Ya know, it's kind of fun doing the impossible." And his studio turned impossible animation into brilliance. Genius. Jobs would say the same; he admired Disney as a true innovator and, of course, picked up Pixar and ran with it into the Disney world while rebuilding Apple. A true innovator of the first order, following a leading pioneer. Holmes would agree, but what she was doing was technically impossible due to the design of her beast. It was REALLY impossible to pack a complex blood-test protocol into a small box. The technology of today just doesn't permit shrinking a full lab into a package that size.

Literally impossible, and she denied that: a true believer. That is a visionary. A good trait when Jobs employed his famous reality distortion field and could make Apple bend to his will; the result was the Macintosh, iPhone, iPod, and iPad. The computer itself was possible, animation was and is possible, but Holmes was bleeding-edge impossible. So beware the genius vision, for if the idea itself is flawed, nothing built on it will work. Jobs and Disney redesigned existing tech and protocols for their vision. Holmes was building a new one and, worse, not realizing it did not, could not work, at least with tech as it stands today. A microprocessor did not exist in 1944, for example. Maybe 20 years from now we can get what she dreamed of, but not today. Too early for her time, and too many investors felt her pain.