
More than a compliance mandate, privacy impact assessments can also spot risks early in the product development cycle.

Image: adiruch na chiangmai via Adobe Stock

Privacy impact assessments (PIAs) are meant to examine privacy protections, but along the way they often turn up deeper insights into an organization's risk exposure, improving governance and strengthening its overall security posture, experts say.

More than a diagnostic tool or compliance checklist, PIAs are essentially templated questionnaires that help organizations identify the privacy risks associated with the information they collect, use, or store, says Rebecca Herold, CEO of the Privacy Professor, a security consultancy. PIA templates typically combine multiple-choice and open-ended questions. While often administered quarterly, PIAs can be done more frequently, or after a breach or suspicious incident.
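
To make that format concrete, here is a minimal sketch of what such a questionnaire might look like expressed as a data structure. The questions, answer choices, and risk weights below are illustrative assumptions, not a standard PIA instrument.

```python
# Hypothetical sketch of a PIA questionnaire template.
# Questions, choices, and risk weights are assumptions for illustration,
# not drawn from any standard PIA instrument.

from dataclasses import dataclass

@dataclass
class Question:
    text: str
    choices: list[str] | None = None   # None means open-ended
    risk_weight: int = 1               # crude weighting for triage

PIA_TEMPLATE = [
    Question("What categories of personal data does this product collect?",
             choices=["None", "Contact info", "Health data", "Biometric data"],
             risk_weight=3),
    Question("Is the data shared with third parties?",
             choices=["No", "Yes, with consent", "Yes, without explicit consent"],
             risk_weight=5),
    Question("Describe how long the data is retained and how it is deleted."),
]

def triage(answers: dict[int, str]) -> int:
    """Sum risk weights for multiple-choice answers that diverge
    from the lowest-risk (first-listed) choice."""
    score = 0
    for i, q in enumerate(PIA_TEMPLATE):
        if q.choices and answers.get(i) != q.choices[0]:
            score += q.risk_weight
    return score

# Example: a product collecting health data but not sharing it scores 3.
print(triage({0: "Health data", 1: "No"}))
```

A real PIA would add scoring guidance and reviewer sign-off, but even this crude triage shows how templated answers can be rolled up into a comparable risk signal across products.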

Above all, PIAs help expose potential privacy issues that can get overlooked in the rush to market. Herold recalls an organization she worked with that developed a saliva test to detect concussions. Unlike the doctors and hospitals that are subject to federal privacy protections, this organization was HIPAA-exempt and hadn't fully thought through the ramifications of the data it wanted to collect.

Not surprisingly, consumers became concerned about who'd be able to access the saliva test results. "The original intent was good, but it was unclear who was getting the data – colleges, employers, insurance companies," Herold explains.

PIAs help ensure such concerns and vulnerabilities get addressed earlier in the product development cycle. And with technologies like artificial intelligence and the Internet of Things on the rise, and market pressure to roll out products before they've been fully vetted, PIAs can smooth the introduction of that teddy bear with the camera in it, for example.

In that respect, PIAs not only offer a clearer view of privacy risks, but also help organizations understand how the personal information they collect is used and transmitted, and where potential vulnerabilities lie, Herold explains. All of that is enormously helpful as organizations measure their risks and then move to mitigate them, she adds.
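
One way to picture that mapping is a simple data inventory that records, for each personal data element, why it is collected and where it flows. The sketch below is hypothetical; the field names, purposes, and recipient allow-list are assumptions made for illustration.

```python
# Hypothetical data-inventory check of the kind a PIA might drive.
# Field names, purposes, and recipients are illustrative assumptions.

inventory = [
    {"field": "saliva_test_result", "purpose": "concussion diagnosis",
     "recipients": ["clinician"], "encrypted_in_transit": True},
    {"field": "email_address", "purpose": None,  # no documented purpose
     "recipients": ["marketing_vendor"], "encrypted_in_transit": True},
    {"field": "date_of_birth", "purpose": "age verification",
     "recipients": ["insurer"], "encrypted_in_transit": False},
]

APPROVED_RECIPIENTS = {"clinician"}  # assumed allow-list

def flag_risks(inventory):
    """Return human-readable findings a PIA reviewer would follow up on."""
    findings = []
    for item in inventory:
        if item["purpose"] is None:
            findings.append(f"{item['field']}: collected without a documented purpose")
        for recipient in item["recipients"]:
            if recipient not in APPROVED_RECIPIENTS:
                findings.append(f"{item['field']}: flows to unapproved recipient '{recipient}'")
        if not item["encrypted_in_transit"]:
            findings.append(f"{item['field']}: transmitted without encryption")
    return findings

for finding in flag_risks(inventory):
    print("-", finding)
```

The point isn't the tooling; it's that writing the inventory down at all forces the "who gets this data, and why" questions that tripped up the saliva-test maker.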

Some industry sectors require regular PIAs. All federal agencies, for example, must perform annual PIAs for their systems, which are then posted online. At the invitation of the National Institute of Standards and Technology (NIST), Herold herself performed a PIA on the agency's plan for smart grids in 2009. 

The EU's General Data Protection Regulation (GDPR) also provides for a data protection impact assessment (DPIA), which is a type of PIA, Herold says. Organizations that process the personal data of people in the EU are legally required to perform a DPIA when that processing is likely to pose a high risk to individuals' rights.

In addition to demonstrating an organization's due diligence, PIAs can be a market differentiator in highly competitive fields, Herold explains. "I wish more application and device companies would do PIAs right at the start of their engineering plans because it would eliminate and mitigate many problems we have with privacy breaches," she says.

Medical device manufacturers, in particular, get hyper-focused on a device's basic functionality. "I'm always trying to get them to realize they need to go beyond getting it to work consistently," Herold says.

Another sector with ongoing security challenges is education. School environments are historically hard to secure because they're open areas, Herold says. And laptops that monitor students are just the tip of the privacy iceberg. "More educators are addressing privacy because there are so many gadgets that tech companies want to provide schools," she adds.

Factor in cloud-based learning apps, the proliferation of student smartphones, and social networks, and you have a privacy quagmire. 

"Any process that's completely automated and had no human intervention is going to have some pieces missing with regard to privacy," Herold says. "You need critical thinking," and PIAs help encourage that objective analysis.


About the Author(s)

Terry Sweeney, Contributing Editor

Terry Sweeney is a Los Angeles-based writer and editor who has covered technology, networking, and security for more than 20 years. He was part of the team that started Dark Reading and has been a contributor to The Washington Post, Crain's New York Business, Red Herring, Network World, InformationWeek and Mobile Sports Report.

In addition to information security, Sweeney has written extensively about cloud computing, wireless technologies, storage networking, and analytics. After watching successive waves of technological advancement, he still prefers to chronicle the actual application of these breakthroughs by businesses and public sector organizations.

