Omer Tene, VP, International Association of Privacy Professionals (IAPP), also contributed to this article.
By any measure, this summer has been a busy time for privacy news. It started with a flurry of enforcement activity in Europe, including announcements from the UK privacy regulator of fines of $230 million against British Airways and $125 million against Marriott. It continued with a high-stakes standoff in Europe's highest court between Max Schrems (a prominent privacy advocate), Facebook, and the Irish Data Protection Commissioner, which could jeopardize the future of transatlantic data flows. And it ended with a big bang: news, released to the humdrum of a summery Friday afternoon, of the FTC's $5 billion fine against Facebook in connection with the Cambridge Analytica scandal.
The message resonated loud and clear in corporate boardrooms from Silicon Valley to London: Privacy has become a first-order media and regulatory concern.
How should businesses respond to this new drumbeat of privacy outcries and enforcement actions? The risks of data mismanagement -- measured in hundreds of millions of dollars and including security breaches, inappropriate information sharing, and "creepy" data uses -- are no longer an acceptable cost of doing business. It is abundantly clear that society cannot experience the full benefits of a digital economy without investing in privacy.
The good news is that the public has recognized the gravity of the problem. Breakthroughs in healthcare, smart traffic, connected communities, and artificial intelligence (AI) confer tremendous societal benefits but, at the same time, create chilling privacy risks. The bad news is that we're hardly ready to address these issues. As Berkeley professors Deirdre Mulligan and Kenneth Bamberger wrote in Privacy on the Ground: Driving Corporate Behavior in the United States and Europe, it's one thing to have privacy "on the books," but it's quite another thing to have privacy "on the ground."
According to research by the International Association of Privacy Professionals (IAPP), more than 500,000 organizations have already registered data protection officers in Europe. Yet only a fraction of those roles are staffed by individuals trained in privacy law, technologies, and operations. To rein in data flows across thousands of data systems, sprawling networks of vendors, cloud architectures, and machine learning algorithms, organizations large and small must deploy highly qualified people, technologies, and processes that are still at an early stage of development.
First, the people who will serve as the foot soldiers of this army of professionals must be modern-day renaissance persons. They have to be well-versed in the technology, engineering, management, law, ethics, and policy of the digital economy. They need to apply lofty principles like privacy, equality, and freedom in day-to-day operational settings to disruptive tech innovations such as facial recognition, consumer genetics, and AI. They need to understand not only the logic underlying black-box machine learning processes but also the mechanics of algorithmic decision-making and the social and ethical norms that govern them. Unfortunately, existing academic curricula are siloed in areas such as law, engineering, and management. Government, academic, and accreditation bodies should work to lower the walls between disciplines to ensure that lawyers and ethicists talk not only to each other but also with computer scientists, IT professionals, and engineers.
Second, researchers and entrepreneurs are building a vast array of technologies to help companies and individuals protect privacy and data. Just last week, OneTrust, a privacy tech vendor, raised $200 million at a valuation of $1.3 billion, making it the first privacy tech unicorn merely three years after its launch. Some of these new technologies help organizations better handle their privacy compliance and data management obligations. Others provide consumers with tools to protect and manage their own data through de-identification, encryption, obfuscation, or identity management. Over the next few years, governments and policymakers should give organizations incentives to innovate not only around data analytics and use but also around protection of privacy, identity, and confidentiality.
Third, organizations should deploy data governance processes and best practices to ensure responsible and accountable data practices. Such processes include privacy impact assessments, consent management platforms, data mapping and inventories, and ongoing accountability audits. With guidance from regulators and frameworks from standard-setting bodies, such as the National Institute of Standards and Technology, procedural best practices will develop for both public and private sector players.
Like so many complex societal issues, privacy concerns require a matrix of responses. We certainly need strong laws and effective enforcement, but organizations should also embrace their stewardship of data and invest in the processes and technologies to better manage their data stores. Importantly, we need to continue to educate and train professionals with the knowledge and skills to make ethical, responsible decisions about how data is handled. To facilitate innovative data uses and unlock the benefits of new technologies, we need privacy not only on the books but also on the ground.