
Google Street View Pursued Wardriving By Design

FCC slaps Google with a $25,000 fine for obstructing its investigation, but finds no laws broken when Google sniffed unencrypted Wi-Fi data, including usernames and passwords.


Google Street View collected unencrypted data--including usernames and passwords--from Wi-Fi hotspots around the world by design.

That fact was literally blacked out when the Federal Communications Commission earlier this month released a copy of its Street View investigation report with numerous redactions. But the Electronic Privacy Information Center and other privacy rights groups had been pressing the FCC to release its report in non-redacted form. Those demands led Google to release a version that redacts only people's names.

"We decided to voluntarily make the entire document available except for the names of individuals," said Google spokeswoman Niki Fenwick in an emailed statement. "While we disagree with some of the statements made in the document, we agree with the FCC's conclusion that we did not break the law. We hope that we can now put this matter behind us."

The more complete FCC report, however, raises numerous questions, such as why a Google engineer who was working on Street View only part-time (under Google's practice of allowing employees to spend 20% of their time on side projects) was allowed to make global data-interception decisions that apparently faced no legal review and were rubber-stamped by managers.

[ Google's new file-storage service is generating plenty of buzz from privacy advocates. Should you be concerned? Google Drive Privacy: 4 Misunderstood Facts. ]

"The data collection resulted from a deliberate software design decision by one of the Google employees working on the Street View project," read the report. In particular, a design document created by a Google employee--named only as "Engineer Doe"--detailed the usefulness of using the Street View cars for "wardriving," according to the report. Wardriving refers to the practice of driving around looking for accessible wireless networks or wireless data traffic, then capturing and storing the data those networks send and receive.
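The distinction at the heart of the case--header information versus unencrypted payload--is visible in the first bytes of any captured 802.11 frame. The following is a minimal sketch (not Google's code) of how a sniffer classifies a frame and checks whether its contents are protected; the sample frame bytes are synthetic.

```python
import struct

FRAME_TYPES = {0: "management", 1: "control", 2: "data"}

def parse_80211_header(frame: bytes):
    """Parse the start of an 802.11 MAC header (simplified sketch)."""
    fc, = struct.unpack_from("<H", frame, 0)  # 16-bit frame-control field
    ftype = (fc >> 2) & 0x3                   # 0=management, 1=control, 2=data
    protected = bool(fc & 0x4000)             # "Protected Frame" (WEP/WPA) bit
    addr2 = frame[10:16]                      # transmitter MAC address
    mac = ":".join(f"{b:02x}" for b in addr2)
    return FRAME_TYPES.get(ftype, "reserved"), protected, mac

# Synthetic unencrypted data frame: type bits = data, protected bit clear.
fc_value = 0x0008  # version 0, type 2, subtype 0, no flags set
frame = (struct.pack("<HH", fc_value, 0)      # frame control + duration
         + b"\xff" * 6                        # addr1 (receiver)
         + bytes.fromhex("00163e112233")      # addr2 (transmitter)
         + b"\x00" * 8)                       # addr3 + sequence control
print(parse_80211_header(frame))  # ('data', False, '00:16:3e:11:22:33')
```

A frame whose protected bit is clear carries its payload in cleartext--which is why capturing data frames from open networks swept up emails and passwords, while the same capture of WPA-protected traffic would have yielded only ciphertext.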

The FCC said it interviewed five Google employees during the course of its investigation, as well as an employee at consulting firm Stroz Friedberg, which Google hired to review its Street View program source code. But according to the FCC, in response to a subpoena to provide a deposition, Engineer Doe through counsel had "invoked his Fifth Amendment right against self-incrimination and declined to testify."

Google has long argued that its Street View data-collection practices were legal. Furthermore, the Department of Justice, after its own investigation into the matter, chose in May 2011 not to prosecute Google for violating the Wiretap Act. Likewise, the Federal Trade Commission wrapped up its related investigation in October 2010, and the next month, the FCC launched its own investigation.

The FCC's report, however, uses more circumspect language than Google to describe the Street View data-collection practices. Notably, the agency said that there wasn't "clear precedent" for applying the Communications Act of 1934 to Wi-Fi communications, and that regardless, without Engineer Doe's testimony, "significant factual questions" remained unanswered. But the agency did hit Google with a $25,000 fine for obstructing its investigation--a charge that Google has denied.

Questions over Street View began surfacing in early 2010, when European regulators asked Google to detail exactly what types of data the company was collecting with its Street View cars. In April 2010, Google said that it was collecting Wi-Fi network information, and specifically, "SSID data (i.e. the network name) and MAC address (a unique number given to a device like a WiFi router)," but no other data, according to a blog post from Peter Fleischer, Google's Global Privacy Counsel. "Networks also send information to other computers that are using the network, called payload data, but Google does not collect or store payload data," he said.
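The SSID and MAC address data Fleischer described are broadcast in the clear by every access point, in periodic beacon frames. As a hedged illustration (not Google's implementation), extracting an SSID from a beacon frame body is a matter of walking its tagged parameters; the beacon bytes below are synthetic.

```python
def extract_ssid(beacon_body: bytes) -> str:
    """Pull the SSID (tag number 0) out of an 802.11 beacon frame body.

    The body opens with 12 bytes of fixed fields (timestamp, beacon
    interval, capabilities), followed by tagged parameters, each laid
    out as: 1-byte tag number, 1-byte length, then the value.
    """
    offset = 12
    while offset + 2 <= len(beacon_body):
        tag, length = beacon_body[offset], beacon_body[offset + 1]
        value = beacon_body[offset + 2 : offset + 2 + length]
        if tag == 0:  # tag 0 is the SSID element
            return value.decode("utf-8", errors="replace")
        offset += 2 + length
    return ""

# Synthetic beacon body: 12 zero bytes of fixed fields, then an SSID tag.
body = b"\x00" * 12 + bytes([0, 7]) + b"HomeNet"
print(extract_ssid(body))  # HomeNet
```

Collecting only this beacon metadata is what Google initially claimed to be doing; the controversy arose because the capture went beyond beacons to the data frames themselves.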

In May 2010, however, the company admitted that, owing to a mistake, it had in fact been collecting payload data sent over Wi-Fi networks that weren't password-protected, but that the collection was incidental, fleeting, and often fragmented. "Quite simply, it was a mistake," read a Google blog post. But according to the FCC's report, Google's payload data collection was by design.

Google quickly pulled the plug on such practices when they came to light. According to the FCC, a report from Stroz Friedberg said that as of May 6, 2010, it had verified that Google had stopped capturing payload data. By July, Google said that it had also removed all Wi-Fi sniffing equipment from its Street View cars.

By October 2010, meanwhile, Google admitted that "in some instances entire emails and URLs were captured, as well as passwords." But Google said it had discarded all such data and that it was putting new privacy controls in place to prevent a repeat scenario.

Rob VandenBrink, a senior consulting engineer at Metafore, said on the Internet Storm Center diary that privacy alarm bells should have been ringing from the start of the wardriving program, although in an engineering aside, he did applaud Google for at least using a well-designed, off-the-shelf tool--Kismet--to handle packet capture. "It was sensible that the engineer didn't go write a new tool for this--they used Kismet to collect the data, then massaged Kismet's output during their later analysis," he said. "Aside from the fact that anyone who's been in almost any SANS class would realize how wrong using the tool was, at least they didn't go write something from scratch."
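The capture-then-massage workflow VandenBrink describes is routine: Kismet writes standard capture logs (including pcap files), which an analyst post-processes offline. A minimal, hedged sketch of reading the classic libpcap record layout--illustrative only, not Google's analysis code--looks like this:

```python
import io
import struct

def iter_pcap_frames(stream):
    """Yield (timestamp, raw_frame) pairs from a classic libpcap file."""
    global_hdr = stream.read(24)  # magic, version, tz, sigfigs, snaplen, linktype
    magic, = struct.unpack_from("<I", global_hdr, 0)
    assert magic == 0xA1B2C3D4, "not a little-endian classic pcap file"
    while True:
        rec_hdr = stream.read(16)  # ts_sec, ts_usec, incl_len, orig_len
        if len(rec_hdr) < 16:
            return
        ts_sec, _ts_usec, incl_len, _orig_len = struct.unpack("<IIII", rec_hdr)
        yield ts_sec, stream.read(incl_len)

# Build a tiny synthetic capture with two records, then re-read it.
hdr = struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 105)  # linktype 105 = 802.11
rec = lambda ts, payload: struct.pack("<IIII", ts, 0, len(payload), len(payload)) + payload
capture = hdr + rec(1273100000, b"\x08\x00") + rec(1273100001, b"\x80\x00")
frames = list(iter_pcap_frames(io.BytesIO(capture)))
print(len(frames))  # 2
```

VandenBrink's point stands either way: using a mature tool made the engineering sound, but no amount of tooling hygiene addresses whether the capture should have happened at all.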

The FCC's Street View report should serve as a lesson to any company pursuing technology projects that could lead to privacy or other legal questions: engineers aren't legal experts. "Long story short, this document outlines how the manager(s) of the project trusted the engineer's word on the legal implications of their activity," VandenBrink said.

Too often, in fact, businesses fail to properly assess the potential legal implications of their technology projects. Many companies simply "take their best shot at the 'do the right thing' decision," he said. "As you can imagine, if the results of a decision like this ever [come] back to see the light of day, it seldom ends well."

Still, how did a technology company of Google's caliber fail to head off the resulting Street View privacy debacle? "In Google's case, they have a legal department on staff, and I'd imagine that one of their primary directives is to keep an eye on privacy legislation, regulations and compliance to said legislation," he said. "Though you can't fault the legal team if the question never gets directed their way."

