New iOS privacy features require developers to disclose what data they're collecting, how they're using it, and with whom they share it.

Heather Federman, VP of Privacy & Policy at BigID

August 13, 2020

5 Min Read

In 2012, the National Telecommunications and Information Administration (NTIA) convened a series of meetings intended to develop a legally enforceable code of conduct that would provide transparency into how companies offering applications and interactive services for mobile devices handle personal information. This multistakeholder process sought input from companies, researchers, advocates, trade groups, and others.

One of the initial proposals for a code of conduct came from a group of Carnegie Mellon researchers at the CyLab Usable Privacy and Security Lab and a security researcher at Microsoft, who had published a paper in 2009 promoting the idea of a "privacy nutrition label" as a de facto standard for all app developers.

The process ended in the spring of 2013 with a group of think tanks, trade organizations, advocates, and companies signing on to the finalized code of conduct. But in the long run, this went nowhere. The voluntary code, which app developers were meant to adopt as a way to provide transparency through short-form notices in their mobile apps, saw almost no uptake in the developer community.

Almost seven years later, Apple has achieved what we could not: a privacy nutrition label. At its 2020 WWDC last month, the company announced new iOS privacy features requiring app developers on its platform to disclose in clear language what data they are collecting, how they're using the data, and with whom they are sharing it — basically, any data that is linked to a user and is being used for ad tracking. And the apps must get users' opt-in consent. This is akin to a nutrition label that will help consumers make informed decisions about whether they want to download an app.

With one software update, Apple has been able to force 1.85 million apps to reveal their privacy practices in a standardized iconographic form. This is a testament to the power of the tech giant, which has about 1.5 billion devices in the market. In other words, Apple is setting the mobile privacy standard, not a governmental body or multistakeholder voluntary process.

Apple's new iOS privacy features are already drawing industry ire. More than a dozen digital ad groups in Europe, including ones backed by Google and Facebook, have complained that app providers who want to track users across apps will now have to get consent from consumers twice, increasing the likelihood that users will opt out. The European Union's General Data Protection Regulation (GDPR) already requires these providers to get user permission to collect data for marketing purposes. And now Apple will be forcing apps to get consent for ad targeting instead of allowing it by default.

Apple's use of the word "tracking" could be seen as a direct assault on advertising providers. Consumers will first have to opt in to ad tracking and they'll know exactly what data is being used and how. When an app tries to access the device's unique identification number for advertisers, a message will pop up that says the app "would like permission to track you across apps and websites owned by other companies."
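For developers, triggering that system prompt goes through the AppTrackingTransparency framework introduced in iOS 14. The sketch below, assuming iOS 14+ and an `NSUserTrackingUsageDescription` entry in the app's Info.plist, shows how an app might request the permission and react to the user's choice:

```swift
import AppTrackingTransparency
import AdSupport

// Sketch: requesting the iOS 14 tracking permission.
// Calling this presents the system prompt described above; the app
// must also declare NSUserTrackingUsageDescription in Info.plist.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user opted in, so the advertising identifier (IDFA)
            // is available to the app.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized; IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // Without consent, the IDFA comes back as all zeros.
            print("Tracking not authorized")
        @unknown default:
            print("Unknown authorization status")
        }
    }
}
```

If the user declines, the advertising identifier is zeroed out rather than simply hidden, which is what makes the opt-in meaningful for ad networks.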

The company also has made it much harder for advertisers to target users based on location. Now, apps will only be able to detect a user's location within 10 square miles instead of a more granular, precise location based on GPS. Location-based tracking is typically used to help marketers understand user behaviors so they can more effectively target them with location-based ads. While people may have resigned themselves to targeting based on website visits, they are increasingly concerned about being tracked by their whereabouts. Only one-third of US smartphone users said in a recent survey that they were comfortable sharing location information for marketing purposes.
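On the location side, iOS 14's CoreLocation API exposes the user's precision choice directly. A minimal sketch, assuming iOS 14+, of how an app can detect whether it is receiving precise or approximate location:

```swift
import CoreLocation

// Sketch (iOS 14+): checking whether the user granted precise or
// approximate ("reduced") location accuracy to this app.
func describeLocationAccuracy(_ manager: CLLocationManager) {
    switch manager.accuracyAuthorization {
    case .fullAccuracy:
        // The app receives precise, GPS-based coordinates.
        print("Precise location granted")
    case .reducedAccuracy:
        // The app only sees an approximate region, not an exact position.
        print("Approximate location only")
    @unknown default:
        print("Unknown accuracy authorization")
    }
}
```

An app with reduced accuracy can ask for a one-time upgrade via `requestTemporaryFullAccuracyAuthorization(withPurposeKey:)`, but the default the user picks is approximate or precise, not an ad network's choice.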

Keep in mind that developers will have to self-report their data practices for the new nutrition label. Self-reported privacy certification programs already have a questionable reputation, as shown most recently by Europe's invalidation of the US government-run "Privacy Shield" program. Plus, mobile apps already have a history of poor privacy practices and of misleading users. For these nutrition labels to be effective, then, Apple must be clear about how it will verify and enforce that the information developers provide is accurate, complete, and up to date. Given that its App Store is already carefully vetted for security issues, this shouldn't be too arduous for the company to handle.

This move by Apple to plant a stake in the ground on behalf of privacy may have far-reaching consequences. As scholar and author Woodrow Hartzog argues in Privacy's Blueprint: The Battle to Control the Design of New Technologies: "Design is power. Design is political. People do not use technologies for whatever tasks or goals they wish to accomplish. Instead, technologies shape how people use them. Technologies shape users' behavior, choice, even attitudes." The iOS changes may raise privacy awareness among consumers who previously didn't think about the information their apps were collecting about them. They will also force advertisers to adopt new business models that aren't totally reliant on knowing user behavior.

In addition, this could set a strong example for other tech providers and could make privacy the new normal. Some of the same researchers from Carnegie Mellon University who proposed the mobile app nutrition label over a decade ago recently proposed a standardized privacy and security label for Internet of Things devices. Apple's user interface and design decisions have been known to lead to sea changes throughout the tech hardware and software industry. When it comes to privacy, hopefully this change won't be an exception.

About the Author(s)

Heather Federman

VP of Privacy & Policy at BigID

Heather Federman is the VP of Privacy & Policy at BigID, where she manages and leads initiatives related to privacy evangelism, product innovation, internal compliance and industry collaboration. Prior to BigID, Heather was the Director of Privacy & Data Risk at Macy's Inc., managing policies, programs, communications and training. Heather was also the Senior Privacy Manager at American Express Co., focusing on AMEX's Global Brand, Marketing & Digital Partnerships. She previously served as a Legal and Policy Fellow for the Future of Privacy Forum (FPF) and as the Public Policy Director for the Online Trust Alliance (OTA), working to further FPF's mission in advancing responsible data practices and OTA's mission in establishing trust in the online ecosystem.

Heather received her B.A. from the Gallatin School of Individualized Study at New York University and her J.D. from Brooklyn Law School, and is a Certified Information Privacy Professional (CIPP/US).
