5/12/2020 10:00 AM

Coronavirus, Data Privacy & the New Online Social Contract

How governments can protect personal privacy in contact tracing while saving people's lives

When Jean-Jacques Rousseau wrote The Social Contract in 1762, he argued that sovereign power resides with the people, and that they alone may choose which freedoms to surrender in exchange for the benefits and stability of government. Now, for the first time in more than a century, we are debating amending or rebalancing aspects of the social contract in order to deal with a deadly pandemic.

One of the key challenges associated with containing the spread of the coronavirus that causes COVID-19 is contact tracing: identifying other individuals and groups with whom a COVID-19-positive individual may have been in contact. Under normal circumstances, the mere idea of using any form of mobile phone data to track users en masse for a purpose they never consented to would be anathema to the spirit of regulations like GDPR and CCPA. But, of course, these are not normal circumstances.

COVID-19 contact tracing is different in that complete anonymization is not possible when identifying COVID-19-positive individuals. To protect others, health systems already track COVID-19 cases and do everything in their power to perform contact tracing. The question is: How can technology help in a way that doesn't fundamentally violate our expectations around privacy?

Privacy vs. Public Health
Governments could use or access mobile phone location information without user consent. The drawback of this is clear: If governments can justify accessing this data in this circumstance, in what other contexts might they also unilaterally decide to use it after COVID-19 has passed? There are also purely opt-in approaches, in which individuals who want to participate in contact tracing can download an app. But a contact-tracing program built around an opt-in app is effective only if a very large number of people actually install and run it.

Google and Apple have proposed an intriguing middle ground. By building new capabilities into the iOS and Android operating systems specifically to enable close-proximity contact tracing with some anonymization built in, they are using technology that helps limit data collection and analysis to just what is essential. For example, according to MIT Technology Review, the system will use Bluetooth signals, which have an inherently limited range and can determine only relative proximity to other devices, while banning the use of location tracking, which would store a device's absolute geographic position. When these capabilities are released, if they are turned on by default, they could allow apps built on the platform to gain more users and result in more effective contact-tracing programs.
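To illustrate how such a proximity-based, anonymized approach can work in principle, here is a minimal sketch. It is not the actual Apple/Google Exposure Notification specification; the names, identifier sizes, and flow are assumptions for illustration only. Each phone broadcasts short-lived random identifiers over Bluetooth, and exposure matching happens entirely on the device.

    import secrets

    # Hypothetical sketch of anonymized proximity tracing; not the real
    # Apple/Google protocol. Identifiers are random and unlinkable to users.

    def new_rolling_id() -> bytes:
        """A short-lived random identifier; real systems rotate these often."""
        return secrets.token_bytes(16)

    class ProximityLog:
        """Identifiers overheard nearby via Bluetooth; no locations are stored."""

        def __init__(self) -> None:
            self.heard = set()

        def record(self, rolling_id: bytes) -> None:
            self.heard.add(rolling_id)

        def check_exposure(self, published_positive_ids: set) -> bool:
            # Matching happens on the device, so no central server learns
            # who was near whom or where.
            return bool(self.heard & published_positive_ids)

    # Phone A broadcasts a rolling ID; phone B records it. If A later tests
    # positive, only A's random IDs are published, and B checks them locally.
    a_id = new_rolling_id()
    phone_b = ProximityLog()
    phone_b.record(a_id)
    print(phone_b.check_exposure({a_id}))  # True -> possible exposure notification

Because only random identifiers ever leave a phone, and only if its owner tests positive, no central party learns a user's location history or social graph.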

With greater access to this information, governments could have accurate contact tracing, which would allow the entire societal approach to prevention, containment, and mitigation of COVID-19 to improve. But the pivotal word here is "could." Privacy advocates are quick to point out there's no way of knowing with certainty that this data would improve contact-tracing efficacy enough to save many lives. But there's really no way of knowing until access is granted, and there's no way to get the people of an entire country — let alone an entire planet — to consent to their data being utilized.

The Andon Cord, Digital Edition
The possibility that technology we already have could protect us against the pandemic is too great a public health benefit not to at least explore. The question is how to guard against the obvious downsides of eroding data privacy controls. The answer lies in recognizing that this is an extraordinary circumstance, and it should be handled as such. Many industries have spent a fair bit of time thinking about how to deal with emergencies. In the heyday of manufacturing, employees in a Toyota factory were empowered to halt the assembly line the moment anyone discovered a problem. The trigger was pulling the "andon cord" so that team leaders and workers could huddle together to solve the problem and restart production following formalized steps.

Today, governments need a similar system they can utilize when a dire emergency overrides privacy concerns. But an andon cord of data usage should have three major components (a rough sketch of all three follows the list):

1. A point of instigation. The protocol should indicate factors for determining catastrophe on a spectrum — for instance, the highest level would be if the continuation of our species were at risk.

2. A point of demarcation. Privacy limits need to be reimposed after de-escalation. That deadline should be set as an actual date and time at the outset of the andon cord pull.

3. A point of privacy. Wherever additional data is collected, it should be collected in a privacy-preserving fashion if possible. MIT's Private Kit, a contact-tracing app, for example, allows infected persons to share their location trails with health officials, but that information is anonymized and patient data is stored locally.
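To make these three components concrete, one could imagine the data-usage andon cord being declared as an explicit, machine-checkable policy: a trigger level, a hard expiry timestamp fixed when the cord is pulled, and privacy requirements that data handling must satisfy. The sketch below is illustrative only; the field names, levels, and dates are assumptions, not any government's actual protocol.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from enum import Enum

    # Illustrative assumptions only; not an actual legal or technical standard.

    class CatastropheLevel(Enum):      # 1. point of instigation
        REGIONAL_EMERGENCY = 1
        GLOBAL_PANDEMIC = 2
        EXISTENTIAL_RISK = 3

    @dataclass
    class DataAndonCord:
        level: CatastropheLevel
        expires_at: datetime           # 2. point of demarcation, fixed at pull time
        anonymized: bool               # 3. point of privacy
        stored_locally: bool

        def collection_allowed(self, now: datetime) -> bool:
            # Extra data collection stops automatically at the published expiry.
            return now < self.expires_at and self.anonymized

    cord = DataAndonCord(
        level=CatastropheLevel.GLOBAL_PANDEMIC,
        expires_at=datetime(2020, 12, 31, tzinfo=timezone.utc),
        anonymized=True,
        stored_locally=True,
    )
    print(cord.collection_allowed(datetime.now(timezone.utc)))  # False once the sunset passes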

It's necessary for governments to outline the three components above publicly, placing emphasis on the steps they will take to maintain privacy and the exact date they plan to stop data usage. With the Patriot Act of 2001, Congress agreed to extend the government's powers for a period of time in exchange for greater national security, but the sunset provisions kept being postponed. That cannot happen in an andon cord situation. Any time a government violates a social contract, it risks losing the trust of the public. Without transparency around pulling a data andon cord, there will be backlash. And that backlash would likely end in people actively blocking data collection on their devices, defeating the whole point of utilizing that data for public health.


Shuman Ghosemajumder is global head of artificial intelligence at F5 Networks (NASDAQ: FFIV). Shuman was previously chief technology officer of Shape Security, which was acquired by F5 in 2020.