Apple earns good grades for stating the purpose of the data collection in a forthright manner, but it loses marks for how it handled the data in the first place, given the resulting security and privacy exposure. Apple blames a programming error for, among other things, storing up to a year's worth of location data on any given phone. From the outside, of course, this data storage didn't look innocent: iPhones were "phoning home" location data to the Apple mother ship on a regular basis, as well as storing the data on computers that synchronized with iOS devices, where it could be recovered using data forensics techniques. Interestingly, security researchers--and the cops--have long known about and been exploiting this capability, said security researcher Alex Levinson in a blog post. Levinson is also the lead engineer for iOS forensic software vendor Katana Forensics, whose Lantern software already includes iOS location data retrieval capabilities.
According to a statement released by Apple on Wednesday, it couldn't care less where you are. Rather, it wants its devices to know your location--not least to support mapping and camera geotagging features. Accordingly, it pushes and pulls a crowd-sourced database of hotspots and cell towers to its devices, instead of relying on global positioning system hardware, which can take minutes to acquire a signal, if one is available.
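In rough terms, that crowd-sourced approach boils down to mapping identifiers of nearby access points to known coordinates. The sketch below is a deliberately simplified illustration of the idea, not Apple's implementation; the BSSIDs, coordinates, and the averaging scheme are all invented for demonstration.

```python
# Simplified sketch of crowd-sourced Wi-Fi geolocation: a database maps
# access-point BSSIDs to coordinates, and the device estimates its
# position by averaging the known locations of the hotspots it can see.
# All BSSIDs and coordinates here are made up for illustration.

# Hypothetical slice of a crowd-sourced database: BSSID -> (lat, lon)
HOTSPOT_DB = {
    "00:11:22:33:44:55": (37.3318, -122.0312),
    "66:77:88:99:aa:bb": (37.3322, -122.0301),
    "cc:dd:ee:ff:00:11": (37.3309, -122.0320),
}

def estimate_position(visible_bssids):
    """Average the known coordinates of the access points in range."""
    known = [HOTSPOT_DB[b] for b in visible_bssids if b in HOTSPOT_DB]
    if not known:
        return None  # nothing recognized: fall back to cell towers or GPS
    lat = sum(p[0] for p in known) / len(known)
    lon = sum(p[1] for p in known) / len(known)
    return (lat, lon)

print(estimate_position(["00:11:22:33:44:55", "66:77:88:99:aa:bb"]))
```

Because the lookup is a database query rather than a satellite fix, it returns an answer in a fraction of a second, which is why Apple favors it over waiting on GPS hardware.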
Of course, this location data can be retrieved from the device, or a device with which the smartphone synchronizes. So Apple has promised to stop storing so much information, to honor opt-outs (a "don't track" setting in the phone), and with the next major release of iOS, to begin encrypting the data it transmits. Apple also said it would begin limiting the amount of location data each phone stores to seven days' worth.
Will that remedy the issue for smartphone users? In the wake of this incident, security and privacy experts have been reviewing Apple iPhone security and finding that location data isn't the only information iOS fails to store securely.
Notably, Michael Sutton, VP of security research for cloud security firm Zscaler, said in a blog post that it's trivial to recover the authentication credentials for numerous applications--including "Evernote, Google Docs, Apple's iDisk and any WebDav enabled server"--because their passwords are stored in plain text. "It is particularly concerning that Google Docs and Apple's Mobile Me are on this list," he said. "Both services leverage single sign-on for their various online services, so knowledge of these credentials would also provide access to Gmail and Apple's MobileMe email service."
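"Trivial to recover" is no exaggeration when credentials sit on disk in plain text: recovery is just reading the file. The sketch below writes and reads a hypothetical property list the way a forensic tool could; the filename and keys are invented for illustration and are not the actual files Sutton examined.

```python
# Demonstration of why plain-text credential storage is trivially
# recoverable. iOS apps commonly persist settings as property lists;
# this hypothetical example shows that anyone with file-system or
# backup access can read such credentials directly -- no cracking
# required.
import os
import plistlib
import tempfile

# An app that stores credentials like this leaves them fully exposed.
prefs = {"username": "alice@example.com", "password": "hunter2"}

path = os.path.join(tempfile.mkdtemp(), "com.example.app.plist")
with open(path, "wb") as f:
    plistlib.dump(prefs, f)

# "Recovery" is simply loading the file back.
with open(path, "rb") as f:
    recovered = plistlib.load(f)
print(recovered["password"])
```

The fix is equally well understood: credentials belong in the iOS Keychain, which encrypts them at rest, rather than in readable preference files.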
Interestingly, Google has drawn fire for not vetting Android applications before it adds them to the official Android Market, while Apple is often lauded for ensuring that its applications perform as specified. But Apple's app review process may not be especially security-intensive, said Sutton.
"Despite the fact that Apple must bless all apps before hosting them in the App Store and is very willing to take a 30% cut for doing so, they're clearly concerned more with blessing the 'user experience' as opposed to security," he said. "Storing passwords in clear text is security 101. If I can spot it in 15 minutes, surely they can have processes in place to identify and prohibit at least basic security issues."
Both the crowd-sourced data collected by Apple and the insecure storage of credentials illustrate how today's smartphones can produce unintended consequences for enterprise security. "Security and privacy concerns in the enterprise are shifting, and one of the key drivers for that is the influx of personal devices--both smartphones and tablets--coming into the workplace," Ahmed Datoo, VP of marketing at mobile device management vendor Zenprise, told me in an interview. "In the past, companies bought devices on behalf of employees, and therefore it was easier for IT to do things like monitor usage of the device. Privacy around employee-owned devices, however, becomes a much bigger deal."
Furthermore, users want access to all of the features and functionality that they've paid for, and they won't let security and privacy concerns get in the way. For example, a recent study from secure email provider VaporStream found that 28% of people said they'd intentionally shared sensitive information via email, in violation of regulations. The rationale is simple: It was easier to share the information that way, regardless of what corporate security policies, not to mention laws, might dictate.
Enterprise administrators take note: Users don't want to submit to controls. But when left to their own devices--in both senses--it's likely that features and functionality will trump corporate data security issues. Accordingly, and based on having a full and open dialog with employees, it's time for IT managers to spy on people's devices too--or at least monitor and manage their security and privacy settings.