
Vulnerabilities / Threats

4/4/2019
02:30 PM
Chad Loeven
Commentary

3 Lessons Security Leaders Can Learn from Theranos

Theranos flamed out in spectacular fashion, but you can still learn from the company's "worst practices."

In Alex Gibney's absorbing new HBO documentary, "The Inventor: Out for Blood in Silicon Valley," we see the cautionary tale of Elizabeth Holmes, the now-infamous entrepreneur who dropped out of Stanford at age 19 to start Theranos. The company promised to disrupt the $36 billion blood-testing market by testing for a wide range of diseases via a single drop of blood, but turned out to be a massive fraud that bilked investors out of billions of dollars and put the lives of consumers at risk.

As someone who has worked in the cybersecurity industry for more than two decades, I couldn't help but think about some of the overarching themes related to how Holmes and her consigliere, Sunny Balwani, operated Theranos and what security leaders might take away from their "worst practices."

Know Your Silos
Countless business articles caution against the risk of operational silos, but you'd be hard-pressed to find a more systemic and flagrant example than the story of Theranos, in which the engineering team responsible for building the machine was quite literally segregated from the laboratory chemists responsible for its testing results. In the film, Theranos engineer Dave Philippides says, "If the people from the chemistry team could talk about what was coming next from the engineering team, they would have said, 'that's not going to solve the problem.' But since everyone was working on it separately, they could all keep on working forever without solving anything."

At Theranos, the silos were a feature, not a bug. Regardless, it should serve as a reminder for security leaders to be aware of their own silo blind spots and ask themselves how information, ideas, and vulnerabilities are shared — or not — across their organizations. How do problems get communicated up the chain of command? What incentives — or disincentives — are in place that might compromise the way information is exchanged?

Beware False Positives
One of the more interesting questions to emerge from the film is whether it was even a good idea for consumers to order their own blood tests without a doctor's prescription. Theranos successfully lobbied the Arizona legislature to "democratize" blood testing, and while there are compelling arguments behind this effort, Dr. Stefanie Seitz makes the case in the film that without context, consumers ordering these tests would either be lulled into complacency or convince themselves that they have cancer, saying: "You can't just look at a lab; you have to look at the whole patient."

In a similar vein, Katherine Hobson at fivethirtyeight.com makes the case that even the most accurate diagnostic tests, applied to a large population, can yield a disproportionate share of false-positive results: "The wider the pool of people being tested, the greater the chance of false positives, which is why screening guidelines generally limit the population to be screened. The more independent tests you do at once, each with its own chance of error, the larger the chance that at least one of those tests produces an incorrect result."
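
To see why this matters, consider a quick back-of-the-envelope calculation. This is a purely illustrative sketch; the specificity, sensitivity, and prevalence figures below are assumptions, not numbers from Theranos, the film, or Hobson's article. Even a test that is wrong only 1% of the time produces a surprising number of false alarms once you stack tests and screen broad, mostly healthy populations.

# Illustrative only: how false positives compound across independent tests.
# The specificity, sensitivity, and prevalence numbers below are assumptions,
# not figures from Theranos, the film, or the cited article.

def prob_at_least_one_false_positive(specificity: float, num_tests: int) -> float:
    """Chance a healthy person fails at least one of num_tests independent tests."""
    return 1 - specificity ** num_tests

def positive_predictive_value(sensitivity: float, specificity: float,
                              prevalence: float) -> float:
    """Probability that a positive result is a true positive (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

if __name__ == "__main__":
    for n in (1, 10, 50, 100):
        p = prob_at_least_one_false_positive(specificity=0.99, num_tests=n)
        print(f"{n:3d} tests at 99% specificity -> {p:.1%} chance of at least one false positive")

    # Screening a broad, low-prevalence population: even a good test is mostly noise.
    ppv = positive_predictive_value(sensitivity=0.95, specificity=0.99, prevalence=0.001)
    print(f"PPV at 0.1% prevalence: {ppv:.1%} of positive results are true positives")

With numbers like these, Hobson's point becomes concrete: at 99% specificity, a panel of 100 independent tests gives a healthy person roughly a 63% chance of at least one false positive, and when the condition being screened for is rare, only a small fraction of positive results are real.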

There's a tendency in the current business landscape to assume that access to more data points automatically leads to better intelligence and more informed decisions. But as any seasoned security practitioner can attest, a multitude of security information and event management (SIEM) systems and network appliances can trigger so many alerts and false positives that their output becomes essentially useless. This is why mature security operations centers increasingly emphasize the crucial role context plays in helping them separate the signal from the noise.
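
As a minimal illustration of what "context" can mean in practice, a SOC might weight raw alerts by how much the affected asset actually matters before anyone looks at them. This is a hypothetical sketch; the asset names, criticality weights, and scoring rule are assumptions for illustration, not any particular SIEM's schema.

# Hypothetical triage sketch: enrich raw alerts with business context before
# ranking them. Asset names, weights, and the scoring rule are illustrative
# assumptions, not any particular product's data model.

from dataclasses import dataclass

@dataclass
class Alert:
    source: str    # e.g., "SIEM", "EDR", "IDS"
    severity: int  # 1 (low) .. 5 (critical), as reported by the tool
    asset: str     # hostname the alert fired on

# Context the tools themselves lack: which assets actually matter, and how much.
ASSET_CRITICALITY = {"db-prod-01": 3.0, "hr-laptop-17": 1.5, "lab-test-vm": 0.5}

def triage_score(alert: Alert) -> float:
    """Raw severity weighted by how much the affected asset matters to the business."""
    return alert.severity * ASSET_CRITICALITY.get(alert.asset, 1.0)

if __name__ == "__main__":
    alerts = [
        Alert("IDS", severity=5, asset="lab-test-vm"),   # critical alert on a noisy test box
        Alert("EDR", severity=3, asset="db-prod-01"),    # medium alert on a crown-jewel database
        Alert("SIEM", severity=2, asset="hr-laptop-17"),
    ]
    for a in sorted(alerts, key=triage_score, reverse=True):
        print(f"{triage_score(a):4.1f}  {a.source:<5}  {a.asset}")

The arithmetic itself is trivial; the point is that the same alert means very different things depending on where it fires, and that is exactly the context a tool-only view can't supply.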

A Culture of Fear Stifles Accountability
Theranos shielded itself from external scrutiny for as long as it did in large part because Holmes and Balwani built a corporate culture steeped in fear and retaliation, which ensured that the very real problems its engineering team faced in building the company's machine could not be adequately addressed. As reporter John Carreyrou puts it in his book about Theranos, Bad Blood: "The biggest problem of all was the dysfunctional corporate culture in which it was being developed. Holmes and Balwani regarded anyone who raised a concern or an objection as a cynic and a nay-sayer. Employees who persisted in doing so were usually marginalized or fired, while sycophants were promoted."

Security leaders would do well to ask themselves: Does our corporate culture allow and encourage dissent from the lower ranks? All too often, you hear stories of a network engineer who discovers a serious vulnerability but doesn't voice the concern for fear of the potential fallout. It's hard to instill a sense of accountability across an organization if it isn't embodied by the leadership team.

Surely, Theranos won't be the last tech company to burn up in such spectacular fashion. But hopefully we as security leaders can take a more constructive approach, one that helps us identify our own organizational deficiencies.



Chad Loeven has been involved in enterprise security for over 20 years. Prior to VMRay he managed technology alliances at RSA, the security division of EMC. He came on board RSA via its acquisition of Silicium Security and Silicium's ECAT ETDR (Endpoint Threat Detection and ...
 

Comments
REISEN1955, 4/4/2019 | 3:31:38 PM
The Vision is flawed: Holmes v. Jobs and Disney
Walt once said, "Ya know, it's kind of fun doing the impossible." And his studio turned impossible animation into brilliance. Genius. Jobs would say the same; he admired Disney as a true innovator and, of course, picked up Pixar and ran with it into the Disney world while rebuilding Apple. A true innovator of the first order following a leading pioneer. Holmes would agree, but what she was doing was technically impossible due to the design of her beast. It was REALLY impossible to pack a complex blood-test protocol into a small box. The technology of today just doesn't permit cramming a full lab into a package.

Literally impossible, and she denied that, a true believer. That is a visionary. A good trait when Jobs employed his famous reality distortion field and could make Apple bend to his will - result: Macintosh, iPhone, iPod, iPad. The computer itself was possible, animation was and is possible, but Holmes was bleeding-edge impossible. So beware the genius vision, for if the idea itself is flawed, nothing beyond it will work. Jobs and Disney redesigned existing tech and protocols for their vision. Holmes was building something new and, worse, not realizing it did not, and could not, work, at least with tech as it stands today. A microprocessor did not exist in 1944, for example. Maybe 20 years from now we can get what she dreamed of, but not today. Too early for her time, and too many investors felt her pain.