Dark Reading is part of the Informa Tech Division of Informa PLC


Edge Articles
9/16/2019 02:30 PM
Terry Sweeney

How a PIA Can CYA

More than a compliance mandate, privacy impact assessments can also spot risks early in the product development cycle.

Image: adiruch na chiangmai via Adobe Stock

Privacy impact assessments (PIAs) are designed to examine privacy protections, but they often turn up deeper insights into an organization's risk exposure along the way, improving governance and the organization's overall security posture, experts say.

More than a diagnostic tool or compliance checklist, PIAs are essentially templated questionnaires that help organizations identify where their privacy risks lie in the information they collect, use, or store, says Rebecca Herold, CEO of the Privacy Professor, a security consultancy. PIA templates typically combine multiple-choice and open-ended questions. While often administered quarterly, PIAs can be done more frequently, or after a breach or suspicious incident.
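To make the idea concrete, a PIA questionnaire of the kind Herold describes could be represented as a simple data structure. This is a minimal sketch under stated assumptions; the question wording, answer choices, and field names below are illustrative inventions, not drawn from any official PIA template.

```python
# Sketch of a PIA questionnaire template: a mix of multiple-choice and
# open-ended questions, with a helper to flag items still needing review.
# All questions and choices here are hypothetical examples.

from dataclasses import dataclass, field


@dataclass
class Question:
    text: str
    kind: str                      # "multiple_choice" or "open_ended"
    choices: list = field(default_factory=list)
    answer: str = ""               # empty until the assessment answers it


@dataclass
class PIATemplate:
    questions: list

    def unanswered(self):
        """Return the questions the assessment has not yet addressed."""
        return [q for q in self.questions if not q.answer]


pia = PIATemplate(questions=[
    Question("What personal data is collected?", "open_ended"),
    Question("Is data shared with third parties?", "multiple_choice",
             choices=["Yes", "No", "Unknown"]),
    Question("How long is the data retained?", "open_ended"),
])

# Answering one question leaves two open items for the next review cycle.
pia.questions[0].answer = "Saliva test results and contact details"
print(len(pia.unanswered()))  # prints 2
```

A template like this makes the quarterly cadence easy to operationalize: each cycle starts from the same question set, and the `unanswered()` check surfaces gaps before a product ships.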

But mostly, PIAs help expose potential privacy issues that may get overlooked in the rush to market. Herold recalls an organization she worked with that developed a saliva test to detect concussions. Unlike doctors and hospitals that are subject to federal privacy protections, this organization was HIPAA-exempt and hadn't really thought through the ramifications of the data it wanted to collect. 

Not surprisingly, consumers became concerned about who'd be able to access the saliva test results. "The original intent was good, but it was unclear who was getting the data – colleges, employers, insurance companies," Herold explains.

PIAs help ensure such concerns and vulnerabilities get addressed sooner in the product development cycle. And with technologies like artificial intelligence and the Internet of Things in ascendance – and the market pressure to roll out products before they've been fully vetted – PIAs can smooth the introduction of that teddy bear with the camera in it, for example.

In that respect, PIAs not only offer a clearer view of where privacy risks are, but they also help organizations understand how personal information they collect is being used or transmitted, as well as where potential vulnerabilities are, Herold explains. And all that is enormously helpful as organizations measure their risks, then move to mitigate them, she adds.

Some industry sectors require regular PIAs. All federal agencies, for example, must perform annual PIAs for their systems, which are then posted online. At the invitation of the National Institute of Standards and Technology (NIST), Herold herself performed a PIA on the agency's plan for smart grids in 2009. 

The EU's General Data Protection Regulation (GDPR) also provides for a data protection impact assessment (DPIA), which is a type of PIA, Herold says. Organizations processing the personal data of people in the EU are legally required to perform a DPIA when that processing is likely to pose a high risk to individuals' rights.

In addition to demonstrating an organization's due diligence, PIAs can be a market differentiator in highly competitive fields, Herold explains. "I wish more application and device companies would do PIAs right at the start of their engineering plans because it would eliminate and mitigate many problems we have with privacy breaches," she says.

Medical device manufacturers, in particular, get hyper-focused on a device's basic functionality. "I'm always trying to get them to realize they need to go beyond getting it to work consistently," Herold says.

Another sector with ongoing security challenges is education. School environments are historically hard to secure because they're open areas, Herold says. And laptops that monitor students are just the tip of the privacy iceberg. "More educators are addressing privacy because there are so many gadgets that tech companies want to provide schools," she adds.

Factor in cloud-based learning apps, the proliferation of student smartphones, and social networks, and you have a privacy quagmire. 

"Any process that's completely automated and had no human intervention is going to have some pieces missing with regard to privacy," Herold says. "You need critical thinking," and PIAs help encourage that objective analysis.


Terry Sweeney is a Los Angeles-based writer and editor who has covered technology, networking, and security for more than 20 years. He was part of the team that started Dark Reading and has been a contributor to The Washington Post, Crain's New York Business, Red Herring, ...
