Survey bias is creating misunderstanding of our feelings on online privacy, Google says. One privacy advocate weighs in.

Thomas Claburn, Editor at Large, Enterprise Mobility

June 23, 2011

4 Min Read

Slideshow: Top 15 Google Apps For Business

At Google, many employees feel that the company has been unfairly characterized as uninterested or insincere in its efforts to protect user privacy. Google insiders, like insiders at any company, understand their own motivations and innovations better than outsiders, who often have their own agenda. They see missteps where others see malice.

There have been missteps, to be sure, from the company's short-lived stance in 2008 that it did not need to include a privacy policy link on its home page, despite California state law, to its mismanaged launch of Buzz and its inadvertent vacuuming of Wi-Fi packet data through Street View cars last year. But only the most hardened Google haters cite such incidents as the sort of evil that the company says it strives not to do.

The trouble for Google is that it has made many enemies in its meteoric rise--many of them with legitimate reasons to resent a competitor that plays harder than the public generally perceives. And now Google's surfeit of success has come back to haunt it, with the Federal Trade Commission reportedly on the verge of launching a formal antitrust investigation, with competitors backing a Do-Not-Track standard that would limit the data collection that drives Google's ad revenue, and with related privacy pressure, among other challenges.

Google has responded to privacy concerns by taking such steps as appointing a new privacy director for products last October and, more recently, rolling out a dashboard tool called Me on the Web to help users monitor what's said about them online. The company is also conducting research to help it reframe the privacy debate.

In a paper to be featured at the forthcoming Symposium on Usable Privacy and Security--held July 20-22, 2011 at Carnegie Mellon University--three Google researchers have found that privacy surveys tend to make people fear for their privacy. And they propose a way to conduct such surveys indirectly, so that questions related to privacy don't provoke an emotional response.

As an example, the study's authors found that the number of users willing to share most or all of their online purchase records with close friends or family declined by 41% when survey questions included privacy and security language.

The paper, "Indirect Content Privacy Surveys: Measuring Privacy Without Asking About It," concludes that "privacy survey wording strongly impacts responses by increasing user reports of privacy concern both with respect to relatively innocuous content types (e.g. news articles) as well as content that contains personal information (e.g. purchase records)."

In short, asking people whether they're worried about privacy risks makes them worried about privacy risks.

Lauren Weinstein, co-founder of People For Internet Responsibility and founder of Privacy Forum, said in a phone interview that this isn't surprising. "Survey bias issues are fundamental and go back as far as surveys," he said.

While some may write off Google's research as self-serving--eliminating emotion from the privacy debate would likely diminish unease about Google's information gathering--Weinstein argues that such research is entirely legitimate, irrespective of what one might infer about the company's motives. He said what's important is to better understand what users really want. The absence of such understanding, he suggested, leads to ill-conceived initiatives like Do-Not-Track, which he considers to be too difficult to implement and too draconian.

"I think that we are only really starting to get a handle on the very basic aspects of [online privacy], beyond what has been primarily an emotional angle up to this point," he said.

Weinstein concedes that we may not be able to completely remove emotion from the privacy debate but insists there's still value in trying to find more scientific ways to gauge what people really want and to translate those desires into functional settings online.

A Google representative was not immediately available for comment.


About the Author(s)

Thomas Claburn

Editor at Large, Enterprise Mobility

Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television, having earned a not particularly useful master's degree in film production. He wrote the original treatment for 3DO's Killing Time, a short story that appeared in On Spec, and the screenplay for an independent film called The Hanged Man, which he would later direct. He's the author of a science fiction novel, Reflecting Fires, and a sadly neglected blog, Lot 49. His iPhone game, Blocfall, is available through the iTunes App Store. His wife is a talented jazz singer; he does not sing, which is for the best.
