"We're not seeing a lot of malware so much ... but we are seeing a lot of privacy concerns from apps that are sharing information that people aren't aware of, or apps that have not been built securely," says Michael Sutton, vice president of security research at Zscaler ThreatLabZ.
For example, he says that several months back, when his researchers were doing work in the mobile space, they ran into certain iOS apps that would ask for passwords to popular services, like Google Docs.
"They would communicate with services, like Google Docs or Dropbox, and upload things and store backups," Sutton says. "All of those authentication credentials were just stored in clear text in the backup file, and so anybody who got a backup of your phone could go through that in plain text."
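To make concrete why plaintext credentials in a backup are dangerous: anyone holding the backup file can recover them with a trivial scan. The sketch below is a minimal Python illustration, assuming a hypothetical key=value preferences file of the sort that gets swept into a device backup; the file contents, key names, and values are all invented for this example.

```python
import re

# Hypothetical app preferences file, the kind that ends up in a device
# backup. Nothing here comes from a real app; keys and values are invented.
backup_contents = """\
username=alice@example.com
password=hunter2
dropbox_token=sk-fake-token-123
theme=dark
"""

def find_plaintext_secrets(text):
    """Flag lines whose key suggests a credential stored in the clear."""
    suspicious = re.compile(
        r"^(password|passwd|secret|.*token)\s*=\s*(\S+)",
        re.IGNORECASE | re.MULTILINE,
    )
    return [(m.group(1), m.group(2)) for m in suspicious.finditer(text)]

for key, value in find_plaintext_secrets(backup_contents):
    print(f"exposed credential: {key} = {value}")
```

The proper fix on iOS is to keep secrets in the Keychain rather than in files that are swept into backups, so a copy of the backup alone does not yield usable credentials.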
According to Sutton, the mobile space is such a "land grab" right now that businesses are desperate to have mobile apps and are willing to outsource to developers who might not be very competent at their jobs, or who just aren't given enough time to do a security review.
"I think the worst part is people think, 'I downloaded it from the store. It's safe,'" he says. "But that's not necessarily the case, and the end users mistakenly think that the gatekeepers are watching their backs."
In fact, in many cases it might not even be in the developer's best interest to keep users' privacy intact.
"One of the big reasons that there's a privacy issue is that mobile apps are monetized differently than traditional software," says Chris Wysopal, CTO of Veracode. "Usually they're low-cost, or they're free and ad-supported. What that means is they're going to need to market efficiently to the people who are using these ad-supported apps, so one aspect is getting the individual's profile, finding out things like sex, age, where they live, and so on. All those things are hugely important for targeting advertising."
The way that these ad-supported apps work is that the developer receives money from an advertising company that supplies a library the developer will link to within the application.
"The application developers might not really even be aware of what the ad libraries they're linking to are doing; they don't have the source code of what that ad library is doing. It is just a black box to them," Wysopal says. "It's just given as a requirement to install, but it turns out that the ad libraries piggyback on the permissions that the apps ask for and try to exploit whatever permission they have."
[Debate whirls around the hype of mobile malware and the solutions we have to fight it. See Rethinking Mobile Security.]
Further exacerbating the problem is the fact that most developers tend to ask for more permissions than they need. According to Wysopal's colleague at Veracode, Chris Eng, vice president of research, they frequently see simple tic-tac-toe games asking for every permission under the sun. Obviously a game like that doesn't need access to the phone's microphone, but it's still asking for it. And many times users don't even realize what permissions they're granting upon installation.
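As an illustration of how such over-permissioning can be spotted, the sketch below parses a made-up AndroidManifest.xml for a hypothetical tic-tac-toe game and flags every requested permission outside a small expected set. The manifest and the expected set are assumptions for illustration, not taken from any real app.

```python
import xml.etree.ElementTree as ET

# Invented manifest for an imaginary tic-tac-toe game.
MANIFEST = """\
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.tictactoe">
    <uses-permission android:name="android.permission.INTERNET"/>
    <uses-permission android:name="android.permission.RECORD_AUDIO"/>
    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
    <uses-permission android:name="android.permission.READ_CONTACTS"/>
</manifest>
"""

# The android: attribute prefix resolves to this XML namespace.
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

# Permissions we might plausibly expect a simple game to need (assumption).
EXPECTED = {"android.permission.INTERNET"}

def audit_permissions(manifest_xml, expected):
    """Return requested permissions that fall outside the expected set."""
    root = ET.fromstring(manifest_xml)
    requested = {
        elem.get(ANDROID_NS + "name")
        for elem in root.iter("uses-permission")
    }
    return sorted(requested - expected)

for perm in audit_permissions(MANIFEST, EXPECTED):
    print("unexpected permission:", perm)
```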
According to Chet Wisniewski, senior security adviser at Sophos, users usually operate under one of two models: the Apple model, in which the company's app store overseers decide for the user whether an app's permissions are appropriate, or the Android model, in which there's an open-door policy but the user is asked to grant permissions at install time. The verbiage is so obscure, and there's no way to tick or untick individual permissions and still run the app, that more than likely the user is just going to say "yes" to everything.
"Trying to determine what the heck it means when it asks for permissions is tough when you don't know what it means," Wisniewski says.
Wysopal agrees, saying that if someone sees that an app wants to communicate over the Internet, their instinct is to say, 'OK, fine.'
"They don't realize that that means your flashlight app could be communicating with some server somewhere," he says.
All of these unchecked permissions become truly scary when an app can not only profile you, but potentially piece together your real identity.
"When you sign up for something, you give an email address or your Facebook login, and once you can tie all of this profile information to a real individual, you have databases that can be created about this individual," Wysopal says. "We know where they live because of their GPS information, where they sleep at night, where they work, and where and when they go shopping. It can start to build a pretty detailed view of your life, because you always have your phone with you and you're always interacting with social networking and messaging and email on the phone. So basically your whole life is out there."
What's more, when enterprise data mingles with personal data, that information is at risk, as well. Wisniewski says that with Android, it is possible to hook into the Google API and create rules that deny or allow app downloads based on the permissions. But iPhones are a harder nut to crack.
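A minimal sketch of the kind of permission-based rule Wisniewski describes: given the set of permissions an app requests, an IT policy either allows or denies the download. The deny list and the app records are invented for illustration; a real deployment would pull each app's requested permissions from the Google APIs Wisniewski mentions rather than hard-coding them.

```python
# Hypothetical deny list of permissions an IT manager considers too risky
# for devices carrying enterprise data (an assumption for this sketch).
DENY_LIST = {
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_CONTACTS",
}

def allowed(app_name, requested_permissions, deny_list=DENY_LIST):
    """Return (decision, reason) for an app based on its permissions."""
    blocked = set(requested_permissions) & deny_list
    if blocked:
        return False, f"{app_name} denied: requests {sorted(blocked)}"
    return True, f"{app_name} allowed"

# Invented example apps with invented permission requests.
apps = {
    "flashlight": ["android.permission.INTERNET",
                   "android.permission.RECORD_AUDIO"],
    "calculator": ["android.permission.READ_PHONE_STATE"],
}
for name, perms in apps.items():
    print(allowed(name, perms)[1])
```

The design mirrors Wisniewski's wish list: permissions like reading phone state pass, while microphone access trips the rule, regardless of which app asks.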
"If I were an IT manager, I would like to be able to say, 'Sure, allow things that can tell what the phone state is, but don't allow things that can record from the microphone or don't allow things that can read from this particular partition where sensitive data is stored," he says. "But Apple doesn't allow that today. There's a lot of power there, and if Apple were to embrace it, that could be one of the best roads forward."