Hack My Google Glass: Security's Next Big Worry?

Wearable computing devices must strike a difficult balance between security and convenience. A recent episode involving Google Glass and malicious QR codes raises questions.
Are wearable computing devices the new big security threat?
That's one question lingering after Lookout Security last month detailed an insidious hack attack against Google Glass: Just by getting Glass to "see" a malicious QR code, an attacker could force the device to connect to a malicious Wi-Fi access point or Bluetooth device, then eavesdrop on all of its communications. Admittedly, the attack wouldn't have triggered a countdown to global doom, but it did highlight the automated, promiscuous network-connecting habits of mobile devices, Glass included.
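To see why a mere image can reconfigure a device, consider the widely used ZXing-style Wi-Fi payload format that QR codes commonly carry. The sketch below is illustrative only, not Lookout's actual exploit; the SSID and password are made up. A scanner that auto-joins any network described in a code it sees will connect to whatever an attacker encodes:

```python
# Illustrative sketch (not Lookout's actual exploit): the common
# ZXing-style "WIFI:" QR payload. A device that auto-applies scanned
# network configs will join whatever the attacker encodes here.

def make_wifi_payload(ssid, password, auth="WPA"):
    """Build a Wi-Fi configuration string in the common QR format."""
    return f"WIFI:T:{auth};S:{ssid};P:{password};;"

def parse_wifi_payload(payload):
    """Parse the payload back into a dict, as an auto-joining scanner might."""
    if not payload.startswith("WIFI:"):
        return None
    fields = {}
    for part in payload[len("WIFI:"):].split(";"):
        if ":" in part:
            key, _, value = part.partition(":")
            fields[key] = value
    return fields

# A QR code carrying this string is indistinguishable, to the eye,
# from any other QR code -- which is the heart of the problem.
payload = make_wifi_payload("FreeAirportWiFi", "hunter2")
print(parse_wifi_payload(payload))
# {'T': 'WPA', 'S': 'FreeAirportWiFi', 'P': 'hunter2'}
```

The fix Google shipped boils down to breaking that automation: requiring the user to confirm before a scanned configuration takes effect.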
Therein lies a problem with wearable computing devices: They lack both physical and virtual keyboards, and thus require a greater degree of automation than the average Android device or iPhone. With that automation, however, comes the risk that the device will automatically do something bad, from either an information security or a privacy perspective.
[ Could a kill switch help? The Trouble With Smartphone Kill Switches. ]
In some respects, this is a good problem for the wearable computing field to have. For years, it was hobbled by awkward input mechanisms -- corded keyboards, joysticks, trackballs. But in this age of small, high-speed processors, voice recognition and relatively ubiquitous Internet connectivity, the release of Google Glass means people can now literally tell their glasses what to do.
Unfortunately, as the Glass QR vulnerability -- patched by Google in June -- illustrates, wearable computing still faces some tricky security and privacy questions. Furthermore, useful solutions to these problems may not yet be at hand.
One problem is user authentication. For starters, unlike a smartphone, Google Glass doesn't offer access restrictions based on passwords or a PIN. That means a thief could easily access any Google account tied to a stolen device, warns InformationWeek columnist Jerry Irvine, who's a member of the National Cyber Security Task Force. Cue the need for restricting what these "bring your own" (BYOD) devices can do, and when. "If an organization doesn't have a BYOD strategy, the emergence of Glass can be a compelling argument to get one in place," said Irvine, who's also the CIO of Prescient Solutions.
Security managers will have many more options when such devices are rolled out by the IT department and restricted to use in specific environments. For example, Duncan Stewart, a research director at Deloitte, told the BBC that wearable computers could be especially useful for workers in environments that don't currently allow for smartphone use. "Someone driving a forklift in a warehouse can't use a PC or smartphone because they will crash into someone," Stewart said. "But imagine if they can drive around and be able to pinpoint a pallet and then the particular box they need on that pallet."
There are numerous security risks that could be blocked outright in that scenario. "There's a difference between a general use computer and a specialty use computer," Bob Rosenberg, CTO of startup facilities management service BluQRux, said in a phone interview. The latter, notably, can be heavily locked down -- for example, to allow only a whitelist of approved apps to be installed, and to block access to any website not on a preapproved list.
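The lockdown Rosenberg describes is, at bottom, an allow-list policy: deny everything by default, permit only what's been preapproved. A minimal sketch, with hypothetical app names and hostnames standing in for a real deployment's lists:

```python
# Minimal allow-list sketch of the specialty-device lockdown described
# above. The app names and hostnames are hypothetical examples; a real
# deployment would manage these lists through MDM policy.

APPROVED_APPS = {"inventory-scanner", "pallet-locator"}
APPROVED_SITES = {"warehouse.example.com", "inventory.example.com"}

def can_install(app_name):
    """Deny by default: only apps on the approved list may be installed."""
    return app_name in APPROVED_APPS

def can_browse(hostname):
    """Block access to any website except the preapproved ones."""
    return hostname in APPROVED_SITES

print(can_install("inventory-scanner"))   # True
print(can_install("social-media-app"))    # False
print(can_browse("attacker.example.net")) # False
```

The design point is that a single-purpose device never needs the general-purpose attack surface, so entire classes of risk disappear rather than needing to be detected.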