"I can tell you from the Department of Justice perspective, if that drive is encrypted, you're done," said Ovie Carroll, director of the cyber-crime lab at the Computer Crime and Intellectual Property section in the Department of Justice, during a recent keynote address at a computer forensics conference in Washington, D.C. "When conducting criminal investigations, if you pull the power on a drive that is whole-disk encrypted, you have lost any chance of recovering that data."
That anecdote is cited by digital forensics expert Simson Garfinkel in an analysis of iPhone security published in Technology Review, in which he asserts that iOS now offers "hardened, military-grade encryption" that's both "tough" and "easy for consumers to use."
But that evolution will have societal repercussions, he warned. "In its efforts to make its devices more secure, Apple has crossed a significant threshold," said Garfinkel, who's an associate professor at the Naval Postgraduate School. "Technologies the company has adopted protect Apple customers' content so well that in many situations it's impossible for law enforcement to perform forensic examinations of devices seized from criminals."
What makes iOS security so good? While early iPhones were insecure, current Apple iOS devices use the Advanced Encryption Standard (AES) algorithm. "After more than a decade of exhaustive analysis, AES is widely regarded as unbreakable," said Garfinkel. "The algorithm is so strong that no computer imaginable for the foreseeable future--even a quantum computer--would be able to crack a truly random 256-bit AES key. The National Security Agency has approved AES-256 for storing top-secret data."
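A back-of-the-envelope calculation shows why brute-forcing the key is off the table. The attacker throughput below is an assumed, wildly optimistic figure for illustration, not a real benchmark:

```python
# Rough estimate of brute-forcing a random 256-bit AES key.
# The guess rate is an illustrative assumption, not a measurement.
keyspace = 2 ** 256                   # possible 256-bit keys
guesses_per_second = 10 ** 18         # an absurdly generous exascale attacker
seconds_per_year = 60 * 60 * 24 * 365

# On average, an attacker finds the key after searching half the keyspace.
years = keyspace / (2 * guesses_per_second * seconds_per_year)
print(f"expected search time: {years:.2e} years")
```

Even at a billion billion guesses per second, the expected search time works out to more than 10^50 years, which is why attacks focus on everything around the algorithm rather than the algorithm itself.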
Furthermore, it would be extremely difficult to extract the AES key used by an iOS device. According to an Apple security white paper released earlier this year, each iOS device actually has its own AES key. "The device's unique ID (UID) and a device group ID (GID) are AES 256-bit keys fused into the application processor during manufacturing," according to Apple. "No software or firmware can read them directly; they can see only the results of encryption or decryption operations performed using them." Apple also said that neither it nor its suppliers keep records of these keys.
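The design Apple describes can be caricatured in a few lines: software may request encryption or decryption with the fused key, but never read the key itself. The sketch below is a toy model of that property only--the `FusedKey` name and the HMAC-based XOR keystream are illustrative stand-ins, not Apple's actual AES hardware:

```python
import hashlib
import hmac
import os

class FusedKey:
    """Toy model of a key fused into hardware: callers can use it for
    encrypt/decrypt operations, but nothing outside this class can read
    the key material. Illustrative only -- the XOR keystream below is a
    stand-in for the real AES engine."""

    def __init__(self):
        self.__uid = os.urandom(32)  # stands in for the factory-fused UID

    def _keystream(self, nonce, length):
        # Derive a keystream from the hidden UID and a caller-supplied nonce.
        out = b""
        counter = 0
        while len(out) < length:
            out += hmac.new(self.__uid, nonce + counter.to_bytes(8, "big"),
                            hashlib.sha256).digest()
            counter += 1
        return out[:length]

    def encrypt(self, nonce, plaintext):
        ks = self._keystream(nonce, len(plaintext))
        return bytes(a ^ b for a, b in zip(plaintext, ks))

    decrypt = encrypt  # XOR with the same keystream reverses itself

key = FusedKey()
ciphertext = key.encrypt(b"nonce-01", b"secret data")
assert key.decrypt(b"nonce-01", ciphertext) == b"secret data"
```

The practical consequence, and the reason this matters to forensics, is that any attack on the data must run on the device itself, because the key never leaves the silicon.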
Another security feature, or deterrent to forensic investigators, is the PIN lock on iOS devices. If enabled, a forensic investigator must try every possible PIN combination until a match is found. But because iOS devices can be set to wipe themselves after 10 consecutive failed PIN attempts, Garfinkel noted, investigators must instead run specialized software on the iPhone itself, where each PIN guess takes 80 milliseconds. Given that limit, brute-forcing a four-digit PIN would require less than 14 minutes, but a 10-digit PIN could take up to 25 years.
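The arithmetic behind those figures is straightforward: an n-digit PIN has 10^n possibilities, and each one costs 80 milliseconds to test on the device:

```python
# Worst-case time to exhaust every PIN at the on-device rate
# of one guess per 80 ms cited by Garfinkel.
SECONDS_PER_GUESS = 0.080

def worst_case_seconds(digits):
    """Seconds needed to try every PIN of the given length."""
    return (10 ** digits) * SECONDS_PER_GUESS

print(worst_case_seconds(4) / 60)              # 4-digit PIN: ~13.3 minutes
print(worst_case_seconds(10) / (3600 * 24 * 365))  # 10-digit PIN: ~25.4 years
```

Each extra digit multiplies the worst case by 10, which is why a long numeric passcode (or an alphanumeric one) changes the brute-force picture from minutes to decades.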
Assuming users employ a long-enough PIN, does that really mean that the latest generations of Apple devices, including the iPhone 4S and iPad 3, are law enforcement-proof? "I'm skeptical," said Bruce Schneier, chief security technology officer of BT, in a blog post reviewing Garfinkel's analysis.
To make his point, Schneier quoted from this passage in his 1996 book Applied Cryptography: "There are two kinds of cryptography in this world: cryptography that will stop your kid sister from reading your files, and cryptography that will stop major governments from reading your files."
"Since then, I've learned two things: 1) there are a lot of gradients to kid sister cryptography, and 2) major government cryptography is very hard to get right," he said in the blog post. "It's not the cryptography; it's everything around the cryptography."
The "everything" he refers to encompasses the many nuances of digital security, which relies on computers--and computers, of course, have bugs. In addition, Schneier said, translating cryptography from the realm of mathematics into practice can be quite difficult.
Earlier this year, for example, researchers from Moscow-based digital forensic toolmaker Elcomsoft analyzed 13 Apple iOS password managers--a.k.a. password keepers, wallets, and safes--to see if they securely stored passwords. Elcomsoft's interest was more than academic, since the company has long sold tools used by law enforcement agencies to crack iPhone data security. It wanted to know if password safes might present a further hurdle to forensic investigators. But despite the tools' stated claims--including one boasting of "military-grade encryption"--the researchers found that only one tool, a free one at that, did a good job of making the passwords it stored genuinely difficult to retrieve.
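One generic failure mode explains how an app can advertise "military-grade encryption" and still fall quickly: deriving the AES key from a short PIN with a single fast hash. The sketch below illustrates that pattern in the abstract--the PIN, hash choices, and iteration count are assumptions for illustration, not findings about any specific app from the Elcomsoft study:

```python
import hashlib

def weak_key(pin):
    # One fast hash of a 4-digit PIN: only 10,000 candidate keys exist,
    # no matter how strong the cipher using the key is.
    return hashlib.md5(pin.encode()).digest()

def stronger_key(pin, salt):
    # A slow, salted derivation (PBKDF2) makes each guess vastly costlier.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000)

# Recover a weakly derived key by trying all 10,000 PINs.
target = weak_key("4831")
cracked = next(f"{p:04d}" for p in range(10_000)
               if weak_key(f"{p:04d}") == target)
print(cracked)  # "4831" -- recovered almost instantly
```

This is Schneier's point in miniature: the cipher can be flawless while "everything around the cryptography"--here, the key derivation--gives the data away.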
Another weak link in the overall iPhone information security model--and not necessarily something Apple could rectify--is the surrounding digital ecosystem: attacks elsewhere can yield the same data that's stored on an iPhone. Notably, in the "epic hack" of technology journalist Mat Honan that occurred earlier this month, an attacker managed to access Honan's iCloud account after social-engineering--tricking--Amazon.com customer service. In such a scenario, an attacker could easily restore what was stored on the iPhone--especially if the phone was saving unencrypted backups to iCloud--and then read the data. More generally, access to an iCloud account reveals much of the information stored on the user's device.
In other words, iOS may offer extremely good mobile device security. But don't bet on it being uncrackable. And don't discount techniques that a dedicated attacker might use to see the data stored on the device without ever needing to crack the device itself.