Privacy is in for a turbulent 2021, with companies facing more privacy regulations, continued attempts to create backdoors in encrypted communications, and the introduction of a variety of privacy-focused technologies.
In October, for example, the US Department of Justice (DoJ) and its allies signed a letter calling for technological solutions to give law enforcement access to specific communications, a move privacy advocates consider a threat. In a more pro-privacy move, as of Dec. 8, Apple will require developers to disclose all the data their apps collect from users, including data collected by third-party advertising frameworks that are included or linked to in the code. Both initiatives will have potential impacts on privacy in 2021.
The rapid changes in the privacy landscape make the topic a perennial concern for security and privacy teams, says Darren Van Booven, lead principal consultant with security-services firm Trustwave.
"Privacy is a topic that keeps coming up in conversations we have with security teams and privacy teams," he says. "It used to be one of those things that we had to bring up in the conversation, but now our clients are the ones to bring it up more and more."
The next few years will force companies to re-evaluate how they approach privacy, as privacy expectations and regulations are changing quickly. The California Consumer Privacy Act (CCPA), which took effect almost a year ago and became enforceable in July, will likely produce its first fines in 2021. The General Data Protection Regulation (GDPR) has already caused companies to pay hundreds of millions of dollars in penalties for shoddy or unethical privacy practices.
Yet the changes are not just about presenting a regulatory stick. Companies are becoming more mature with their privacy practices and more focused on pursuing customer-friendly privacy policies because of consumer pressures, says Heidi Shey, principal analyst for security and risk at Forrester Research.
"Consumers may not pay attention if there is news of a data breach that was the result of a security incident. There is a greater willingness to forgive those types of things," she says. "But if your company makes the news because of an unethical practice, or you are using data in the way that people did not expect, then they will have concerns about how else you are using their data."
A Varied Landscape of Privacy Regulations
One of the main trends in coming years will be the addition of state privacy regulations modeled on the CCPA. Penalties range from $2,500 per violation to $7,500 per intentional violation if a company does not correct its privacy practices within 30 days.
The addition of new laws means companies need to be aware of the various regulatory frameworks that are now in force. In May 2018, European data protection authorities began enforcing the GDPR, which has already led to some significant fines, including a nearly $244 million penalty for British Airways, a $131 million levy on Marriott International, and a $59 million fine for Google.
Not only do companies with California or European consumers have to abide by those laws, but at least 15 other states have introduced similar legislation. The result will be complications for US companies, says Trustwave's Van Booven.
"Looking forward, we have a number of different draft privacy requirements in bill form in different states, and they all look different," he says. "Adhering to 50 different privacy requirements makes planning difficult. A lot of companies are frustrated by it."
Debate Will Continue Over Encryption Backdoors
Along with more privacy-focused regulations, some governments' efforts to undo pro-privacy encryption continue to grow. The debate over ways for governments to lawfully gain access to encrypted communications — often referred to as a "backdoor" — has continued in 2020 and will likely become a public fight (again) in 2021. Often linked by advocates to efforts to fight child abuse and terrorism, the campaigns against end-to-end encryption have played out many times since the 1990s.
The European Union is currently considering "a laundry list of tortuous ways to achieve the impossible: allowing government access to encrypted data, without somehow breaking encryption," including ways to monitor speech using a tool on the client-side device, according to the Electronic Frontier Foundation. The EU's Counter Terrorism Coordinator has called for the bloc to pursue a "front door" approach and engage in the public debate to require private companies to come up with solutions.
In October, the DoJ released a joint statement with other members of the Five Eyes Alliance — Australia, Canada, New Zealand, and the United Kingdom, which share intelligence on threats — supporting methods of accessing encrypted communications.
"[W]hile encryption is vital and privacy and cyber security must be protected, that should not come at the expense of wholly precluding law enforcement, and the tech industry itself, from being able to act against the most serious illegal content and activity online," the statement said.
More Focus on Unintended Uses and AI
Outside of the perennial debate over encryption backdoors, new technological threats to privacy continue to emerge. Deepfake videos use publicly accessible images of people and deep neural networks to create videos of people doing and saying things they never did. Machine-learning and artificial-intelligence researchers regularly scrape data from the Internet to create systems that many people believe violate their privacy, such as ClearviewAI's ability to use online information to match an image of people to all of their public information.
These unintended uses of publicly available information have opened a new front in the battle for privacy and consumer control of data. Consumers may give up their data for one particular use case — such as an online profile — but then find that their images have been swept into a large dataset that businesses are using in ways that violate their privacy.
This is changing the way we think about privacy because there is a consent angle that we have not really thought about, says Davi Ottenheimer, vice president of trust and digital ethics at Inrupt, a startup developing pro-privacy data systems for the Web.
"Just because someone has uploaded their image to the Internet, many businesses think that 'public' means 'consent,' when it does not," he says. "Just because someone makes something public does not mean you get to use it however you want, which is pretty well understood in terms of copyright, but not in terms of privacy."
Technology to the Rescue?
A variety of technologies are attempting to help users gain some ground in the privacy battle. Solid, a technology from the Massachusetts Institute of Technology and Web creator Tim Berners-Lee, aims to give users more control over how their data on the Web is accessed. The private company creating solutions for the open source specification announced four pilots in November for major European clients that could lead to greater adoption.
For companies, other promising technologies could protect secrets and privacy. Another group at MIT announced the Secure Cyber Risk Aggregation and Measurement (SCRAM) system for sharing breach data anonymously. The technology uses cryptographic techniques to allow calculations on aggregate breach losses without ever revealing any individual victim's information.
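The core idea — computing a total without any party learning another's individual figure — can be illustrated with additive secret sharing, a common building block of secure multiparty computation. The sketch below is an illustration of that general concept only, not SCRAM's actual protocol; the share counts, modulus, and sample loss figures are arbitrary choices for demonstration.

```python
import secrets

MODULUS = 2**64  # illustrative modulus; shares are integers mod 2^64

def share(value: int, n_parties: int = 3) -> list[int]:
    """Split `value` into n random shares that sum to it mod MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine shares into the original value."""
    return sum(shares) % MODULUS

# Each company splits its private loss figure into shares and sends
# one share to each aggregator; a single share reveals nothing.
losses = [120_000, 450_000, 80_000]
all_shares = [share(loss) for loss in losses]

# Each aggregator sums only the shares it holds; just the combined
# total is ever reconstructed, never any individual input.
per_party_sums = [sum(col) % MODULUS for col in zip(*all_shares)]
total = reconstruct(per_party_sums)
```

Because the shares of each value are uniformly random subject to summing to that value, any aggregator seeing only its own column learns nothing about an individual company's losses, yet the column sums still recombine to the true total.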
Finally, companies that have begun to monitor their remote workers should evaluate whether their technological approach violates any privacy norms. While many companies have adopted workplace monitoring software — especially as a reaction to a more distributed workforce — some will inevitably go too far, says Forrester's Shey.
"Employee privacy will become a big issue next year," she says. "When many companies think about privacy regulations, they almost always think about customers, not employees."
That will change, she says.