Biometrics Regulation Heats Up, Portending Compliance Headaches
A growing thicket of privacy laws regulating biometrics is aimed at protecting consumers amid increasing cloud breaches and AI-created deepfakes. But for businesses that handle biometric data, staying compliant is easier said than done.
This year could be a pivotal one for biometric privacy legislation. The topic is heating up and sits at the intersection of four trends: increasing artificial intelligence (AI)-based threats, growing use of biometrics by businesses, anticipated new state-level privacy legislation, and a new executive order issued by President Biden this week that includes biometric privacy protections.
But the increased scrutiny could backfire: The regulations may create conflicts and complex governance issues for corporations trying to meet the new strictures, especially the tranche of new state laws set to take effect. That means businesses need to stay current as the legal landscape evolves.
Amy de La Lama, a lawyer with Bryan Cave Leighton Paisner, which tracks state privacy laws, says that businesses need to be more forward-looking, anticipating and understanding the risks so they can build the appropriate infrastructure to track and use biometric data.
"This means they should work more closely between their business and legal functions to understand how to use biometrics in their products and services and to understand the legal requirements," she says.
Biometrics Regulation Lags State Privacy Efforts
Various states have enacted data privacy laws in the past two years, including Delaware, Indiana, Iowa, Montana, New Jersey, Oregon, Tennessee, and Texas. This adds to privacy laws already enacted in California, Colorado, Connecticut, Utah, and Virginia.
Yet despite this growing body of privacy protection, not all states have done much in the way of regulating biometrics. Colorado's privacy law, for example, doesn't explicitly define biometric data, even though it regulates how such data is processed.
Meanwhile, five states have passed biometrics-specific regulations: Illinois, Maryland, New York, Texas, and Washington. While that sounds like a trend, many of these laws are limited; New York's, for instance, focuses solely on prohibiting employers from collecting fingerprints as a condition of employment.
Of the five states with biometrics-related statutes, Illinois' Biometric Information Privacy Act has been around the longest, since 2008, and is the most comprehensive, covering how biometric data is collected, stored, and used. Yet it took until this week for damages to be settled in the lawsuit a group of truck drivers brought against the BNSF railroad several years ago over a requirement that they scan their fingerprints before entering an Illinois rail yard.
Things could be changing: New York is considering at least three bills this year that would expand its protections into more comprehensive biometric controls, and committees in at least 14 other states are weighing bills that would broaden biometric protections as well.
A Confusing Patchwork of Data Compliance Requirements
The subtle differences among the state laws can cause compliance conflicts: The statutes diverge on how biometric privacy is regulated, when they take effect, and what reporting they require.
"Biometrics is clearly in the crosshairs right now," says David Stauss, a leading expert at the law firm Husch Blackwell, which tracks privacy laws across the country, "and it is at the top of the list of managing sensitive data concerns. It is incredibly difficult for companies to track all these requirements. These regulations are a constantly moving target, and akin to building a ship as we sail it."
For example, Texas' and Montana's privacy laws go into effect on July 1, but Indiana's law won't take effect until Jan. 1, 2026. California's law creates new requirements for sensitive personal information and allows consumers to limit how businesses can use certain data. Virginia's law has a more restrictive definition of biometric data and limits how it can be processed.
Also, each state uses a different mix of criteria to determine which businesses are covered, based on how much revenue is generated in the state, the number of consumers affected, and whether the business is for-profit.
All of this means compliance will be complicated for companies doing business nationally: They will have to audit their data protection procedures, understand how they obtain consumer consent or allow consumers to restrict the use of their data, and make sure those practices match the subtleties of each state's regulations.
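As a purely illustrative sketch (not legal guidance), a compliance team might maintain a small per-state register that its audit tooling can query to flag which obligations apply to a given data flow. Every threshold, flag, and state entry below is a hypothetical placeholder rather than actual statutory text.

```python
# Illustrative only: a toy per-state register of biometric-privacy obligations
# that audit tooling could query. All thresholds and flags are hypothetical
# placeholders; a real register would be maintained with legal counsel.
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class StateRule:
    revenue_threshold: int             # in-state revenue that triggers coverage (hypothetical)
    consumer_threshold: int            # in-state consumers that trigger coverage (hypothetical)
    requires_opt_in_consent: bool      # must consent be obtained before collection?
    allows_consumer_restriction: bool  # can consumers restrict use of their data?


STATE_RULES: dict[str, StateRule] = {
    "TX": StateRule(25_000_000, 100_000, True, True),
    "MT": StateRule(25_000_000, 50_000, True, True),
    "IN": StateRule(25_000_000, 100_000, True, False),
}


def applicable_rule(state: str, in_state_revenue: int, consumers: int) -> Optional[StateRule]:
    """Return the state's rule if this business profile appears to be in scope."""
    rule = STATE_RULES.get(state)
    if rule and (in_state_revenue >= rule.revenue_threshold
                 or consumers >= rule.consumer_threshold):
        return rule
    return None


# Example: flag obligations for a hypothetical business profile in Texas.
print(applicable_rule("TX", in_state_revenue=30_000_000, consumers=40_000))
```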
Contributing to the compliance headaches: The executive order sets high goals for various federal agencies in regulating biometric information, but businesses could be confused about how to interpret the resulting rules. For example, does a hospital's use of biometrics fall under rules from the Food and Drug Administration, Health and Human Services, the Cybersecurity and Infrastructure Security Agency, or the Justice Department? Probably all four.
And that's before even considering the international implications, as Europe and other regions add to this crazy quilt of privacy rules.
Biometrics Use Expands, Despite Trust Problems
This complex legal landscape is driven by the growing use of biometrics to protect private and business data — and the cybersecurity threats that come with that.
Vendors are doing a better job of incorporating these technologies into broader software packages; last fall, for example, Amazon announced it would expand its palm-scanning Amazon One service to enable better enterprise access controls.
But while fingerprint-, face-, and palm-scanning technologies have been around for years (the FBI has collected many millions of palm scans over the past decade), Amazon stores the palm prints it collects in the cloud, which could make leaks or abuses more likely, according to Mark Hurst, CEO of Creative Good.
"These palm readers are intended to normalize the act of giving up your biometric data anywhere, any time," Hurst says. "And what happens if the palm data — like so many other ID systems — gets hacked? Good luck finding a new palm."
Meanwhile, criminal use of AI-generated deepfake video impersonations that abuse biometric data such as face scans is on the rise. Earlier this year, a deepfake attack in Hong Kong was used to steal more than $25 million, and others will certainly follow as AI technology for producing biometric fakes gets better and easier to use.
The conflicting regulations and criminal abuses could explain why consumer confidence in biometrics has taken a nosedive.
According to GetApp's 2024 Biometric Technologies Survey of 1,000 consumers, the share of individuals who highly trust tech companies to safeguard biometric data has fallen from 28% in 2022 to just 5% in 2024. The company attributes the drop to the increasing number of data breaches and reports of identity theft.
"To mitigate legal, reputational, and financial risks, ensure that biometric data is captured with consent and stored securely in compliance with privacy regulations," says Zach Capers, senior security analyst at GetApp. But that might be easier said than done, especially as future biometric laws offer conflicting requirements.