
Google Street View Pursued Wardriving By Design

FCC slaps Google with a $25,000 fine for obstructing its investigation, but finds no laws broken when Google sniffed unencrypted Wi-Fi data, including usernames and passwords.


Google Street View collected unencrypted data--including usernames and passwords--from Wi-Fi hotspots around the world by design.

That fact was literally blacked out when the Federal Communications Commission earlier this month released a copy of its Street View investigation report with numerous redactions. But the Electronic Privacy Information Center and other privacy rights groups had been pressing the FCC to release its report in non-redacted form. Those demands led Google to release a version that redacts only people's names.

"We decided to voluntarily make the entire document available except for the names of individuals," said Google spokeswoman Niki Fenwick in an emailed statement. "While we disagree with some of the statements made in the document, we agree with the FCC's conclusion that we did not break the law. We hope that we can now put this matter behind us."

The more complete FCC report, however, raises numerous questions, such as why a Google engineer who was working on Street View only part-time (under Google's practice of allowing employees to spend 20% of their time on other projects) was allowed to make global data-interception decisions that apparently faced no legal review and were rubber-stamped by managers.

[ Google's new file-storage service is generating plenty of buzz from privacy advocates. Should you be concerned? Google Drive Privacy: 4 Misunderstood Facts. ]

"The data collection resulted from a deliberate software design decision by one of the Google employees working on the Street View project," read the report. In particular, a design document created by a Google employee--named only as "Engineer Doe"--detailed the usefulness of using the Street View cars for "wardriving," according to the report. Wardriving refers to the practice of driving around looking for accessible wireless networks or wireless data traffic, then sniffing and storing the data they're sending and receiving.

The FCC said it interviewed five Google employees during the course of its investigation, as well as an employee at consulting firm Stroz Friedberg, which Google hired to review its Street View program source code. But according to the FCC, in response to a subpoena to provide a deposition, Engineer Doe through counsel had "invoked his Fifth Amendment right against self-incrimination and declined to testify."

Google has long argued that its Street View data-collection practices were legal. Furthermore, the Department of Justice, after its own investigation into the matter, chose in May 2011 not to prosecute Google for violating the Wiretap Act. Likewise, the Federal Trade Commission wrapped up its related investigation in October 2010, and the next month, the FCC launched its own investigation.

The FCC's report, however, uses more circumspect language than Google to describe the Street View data-collection practices. Notably, the agency said that there wasn't "clear precedent" for applying the Communications Act of 1934 to Wi-Fi communications, and that regardless, without Engineer Doe's testimony, "significant factual questions" remained unanswered. But the agency did hit Google with a $25,000 fine for obstructing its investigation--a charge that Google has denied.

Questions over Street View began surfacing in early 2010, when European regulators asked Google to detail exactly what types of data the company was collecting with its Street View cars. In April 2010, Google said that it was collecting Wi-Fi network information, and specifically, "SSID data (i.e. the network name) and MAC address (a unique number given to a device like a WiFi router)," but no other data, according to a blog post from Peter Fleischer, Google's Global Privacy Counsel. "Networks also send information to other computers that are using the network, called payload data, but Google does not collect or store payload data," he said.
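The distinction Fleischer drew maps onto the 802.11 frame types themselves: management frames (such as beacons) carry network identifiers like the SSID and BSSID, while data frames carry payload. A collector that keeps only management frames records network metadata; one that also stores data frames is capturing payload. This sketch uses the standard 802.11 frame-control layout, with fabricated frame bytes:

```python
# Sketch: filtering captured 802.11 frames by type. Keeping only
# management frames (beacons, probe responses) yields SSID/MAC metadata;
# keeping data frames as well means collecting payload.

MGMT, CTRL, DATA = 0, 1, 2

def frame_type(frame: bytes) -> int:
    """Bits 2-3 of the first frame-control byte give the 802.11 frame type."""
    return (frame[0] >> 2) & 0b11

beacon     = bytes([0x80, 0x00]) + b"<mgmt body>"  # subtype 8, type 0: management
data_frame = bytes([0x08, 0x00]) + b"<payload>"    # subtype 0, type 2: data

metadata_only = [f for f in (beacon, data_frame) if frame_type(f) == MGMT]
print(len(metadata_only))  # 1 -- only the beacon survives the filter
```

Per the FCC report, the Street View software did not apply such a filter at capture time: it stored data frames from open networks as well.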

In May 2010, however, the company admitted that owing to a mistake, it had in fact been collecting payload data that wasn't password-protected, but that the data collection was incidental, fleeting, and often fragmented. "Quite simply, it was a mistake," read a Google blog post. But according to the FCC's report, Google's payload data collection was by design.

Google quickly pulled the plug on such practices when they came to light. According to the FCC, a report from Stroz Friedberg said that as of May 6, 2010, it had verified that Google had stopped capturing payload data. By July, Google said that it had also removed all Wi-Fi sniffing equipment from its Street View cars.

By October 2010, meanwhile, Google admitted that "in some instances entire emails and URLs were captured, as well as passwords." But Google said it had discarded all such data and that it was putting new privacy controls in place to prevent a repeat scenario.

Rob VandenBrink, a senior consulting engineer at Metafore, said on the Internet Storm Center diary that privacy alarm bells should have been ringing from the start of the wardriving program, although in an engineering aside, he did applaud Google for at least using a well-designed, off-the-shelf tool--Kismet--to handle packet capture. "It was sensible that the engineer didn't go write a new tool for this--they used Kismet to collect the data, then massaged Kismet's output during their later analysis," he said. "Aside from the fact that anyone who's been in almost any SANS class would realize how wrong using the tool was, at least they didn't go write something from scratch."
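The "massage" step VandenBrink describes--post-processing a capture tool's logs rather than writing a sniffer from scratch--might look something like the following. The CSV layout here is fabricated for illustration and is not Kismet's actual output schema:

```python
# Sketch: tallying open vs. encrypted networks from a capture tool's
# log file. The log format below is invented for the example; real
# Kismet logs use their own schemas.

import csv
import io
from collections import Counter

fake_log = """\
bssid,ssid,encryption
00:11:22:33:44:55,CoffeeShop,None
66:77:88:99:aa:bb,HomeNet,WPA2
cc:dd:ee:ff:00:11,Airport,None
"""

def tally_encryption(log_text: str) -> Counter:
    """Count networks seen, grouped by advertised encryption type."""
    reader = csv.DictReader(io.StringIO(log_text))
    return Counter(row["encryption"] for row in reader)

print(tally_encryption(fake_log))  # Counter({'None': 2, 'WPA2': 1})
```

The engineering choice itself was sound, as VandenBrink notes; the problem was that nobody asked whether the capture should have been happening at all.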

The FCC's Street View report should serve as a lesson to any company pursuing technology projects that could lead to privacy or other legal questions: engineers aren't legal experts. "Long story short, this document outlines how the manager(s) of the project trusted the engineer's word on the legal implications of their activity," VandenBrink said.

Too often, in fact, businesses fail to properly assess the potential legal implications of their technology projects. Many companies simply "take their best shot at the 'do the right thing' decision," he said. "As you can imagine, if the results of a decision like this ever [come] back to see the light of day, it seldom ends well."

Still, how did a technology company of Google's caliber fail to head off the resulting Street View privacy debacle? "In Google's case, they have a legal department on staff, and I'd imagine that one of their primary directives is to keep an eye on privacy legislation, regulations and compliance to said legislation," he said. "Though you can't fault the legal team if the question never gets directed their way."

