On the heels of KeyRaider's attack on jailbroken iPhones, attackers have shown they can hit non-jailbroken devices too, sneaking 39 weaponized apps onto the official App Store and past Apple's best efforts to lock down its development environment.

Sara Peters, Senior Editor

September 21, 2015


Although Apple's closed development environment has largely succeeded in keeping the App Store relatively free of malicious Mac and iOS apps, Apple's borders have begun to show some weak spots. XcodeGhost, detailed by Palo Alto Networks, is the most recent example, and the most critical.

XcodeGhost is a Trojanized version of Apple's application development software, Xcode. Attackers uploaded it to the Chinese cloud storage service Baidu Yunpan -- a regional, third-party alternative to Apple's own download servers that offers shorter download times for iOS and Mac developers in China. Innocent app developers then used XcodeGhost to write apps and submit them to the App Store, never knowing that those apps were malicious.

In this way, 39 iOS apps were weaponized, among them WeChat, one of the most popular instant messaging applications in the world, impacting hundreds of millions of users worldwide, according to Palo Alto. Banking, stock trading, gaming, and other apps were infected as well.


The malware payload itself uploads device and app information to a command-and-control (C2) server. It can also receive C2 commands to display fake alert messages that social engineer users into entering credentials, "hijack opening specific URLs based on their scheme, which could allow for exploitation of vulnerabilities in the iOS system or other iOS apps," and "read and write data in the user’s clipboard, which could be used to read the user’s password if that password is copied from a password management tool." The malware has also been seen phishing for users' iCloud passwords.
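
To put those capabilities in concrete terms, the following Swift sketch shows the kinds of ordinary iOS APIs such behaviors map onto: presenting a fake credential prompt, opening an attacker-chosen URL scheme, and reading the clipboard. It is purely illustrative, not XcodeGhost's actual code, and names such as `c2Command` are hypothetical placeholders.

```swift
import UIKit

// Illustrative sketch only -- NOT XcodeGhost's code. It shows how the
// behaviors described above correspond to standard iOS APIs.
final class PayloadSketch {

    // 1. A fake alert that social engineers the user into entering credentials.
    func showFakeLoginAlert(from viewController: UIViewController) {
        let alert = UIAlertController(title: "Apple ID Verification",
                                      message: "Please re-enter your password.",
                                      preferredStyle: .alert)
        alert.addTextField { $0.isSecureTextEntry = true }
        alert.addAction(UIAlertAction(title: "OK", style: .default) { _ in
            let capturedPassword = alert.textFields?.first?.text ?? ""
            // A real payload would exfiltrate `capturedPassword` to its C2 server here.
            _ = capturedPassword
        })
        viewController.present(alert, animated: true)
    }

    // 2. "Hijack opening specific URLs based on their scheme": any app can ask
    // iOS to open an arbitrary URL, including another app's custom scheme.
    func openURL(from c2Command: String) {
        guard let url = URL(string: c2Command) else { return }
        UIApplication.shared.open(url)
    }

    // 3. Read the system clipboard, which may hold a password the user just
    // copied from a password manager.
    func readClipboard() -> String? {
        return UIPasteboard.general.string
    }
}
```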

Yet it's not XcodeGhost's payload that security experts find interesting; it's what it means for the security of the Apple development environment.

"One very interesting aspect of this incident is that the developers of the apps had no knowledge that their own code was being used to carry malware," says Chris Wysopal, CTO and CISO of Veracode. "It was the modified development environment, Xcode, that introduced the payload."

Last month at the Black Hat conference, Synack researcher Patrick Wardle unveiled exploits that circumvent Gatekeeper, Apple's mechanism for blocking untrusted, unsigned code from running on Mac OS X. Yet that was a proof-of-concept exploit demonstrated by a researcher, not an attack in the wild.

Just three weeks ago, a new family of iOS malware, KeyRaider, stole 225,000 legitimate Apple accounts. Yet Apple's official verification process remained relatively unscathed, because KeyRaider only affected devices that had been jailbroken.

XcodeGhost, by contrast, is in the wild, and it affects standard, non-jailbroken iOS devices, not just jailbroken ones.

The XcodeGhost attackers found weaknesses in the Apple verification system. First, they took advantage of the fact that some app developers do not download Xcode directly from Apple.

"Due to internet restrictions and longer download times, people in China are used to using local services" like Baidu Yunpan, says Lancope vice president of threat intelligence Gavin Reid. "This should be a wake-up call for software developers to really pay attention to their source materials. Most US and European developers download Xcode directly from Apple, making a repeat of the same problem unlikely."

Using a regional service to download Xcode is just one of several risky behaviors app developers regularly engage in, according to Tod Beardsley, security research manager at Rapid7. "The success of XcodeGhost illustrates that skipping certificate checks and acquiring untrusted software," by disabling or bypassing Apple's Gatekeeper code-signing validation tool, "is a fairly normal practice, even for established software companies with millions of users," he says.

"The important thing to stress is that these behaviors don't usually lead to major compromises of developer security," says Beardsley. "Most of the time, this risky behavior doesn't end up causing any harm at all. Skipping certificate checks is a lot like jaywalking; most of the time, everything turns out fine. It's not that developers are dumb and don't know the risks; they simply consider the risk extremely unlikely, and if it's slightly more convenient to ignore one or two security best practices, they will proceed accordingly."

Wysopal says this case shows that developers need to start paying more attention to security. "Analyzing the compiled code for vulnerabilities and malware using technologies such as binary static assessment and behavioral analysis to detect if malware has been injected between development and distribution should be mandatory before apps are ever published,” he says.
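
A narrow, concrete slice of the kind of post-build checking Wysopal describes is scanning a compiled binary for known bad indicators. The Swift sketch below simply searches an app's executable for indicator strings such as the C2 domain Palo Alto tied to XcodeGhost; it assumes the indicator appears in cleartext and is no substitute for full static or behavioral analysis.

```swift
import Foundation

// Minimal sketch of an indicator-of-compromise scan over a compiled binary.
// The list is illustrative: the first entry is the C2 domain publicly reported
// for XcodeGhost; the rest would come from an organization's own threat intel.
let knownIndicators = [
    "init.icloud-analysis.com"   // C2 domain reported by Palo Alto Networks
    // , "bad.example"           // hypothetical additional indicators
]

// Returns the indicators found as cleartext strings inside the binary at `path`.
func scanBinary(at path: String) throws -> [String] {
    let binary = try Data(contentsOf: URL(fileURLWithPath: path))
    return knownIndicators.filter { binary.range(of: Data($0.utf8)) != nil }
}

// Usage:
// let hits = try scanBinary(at: "/path/to/MyApp.app/MyApp")
// if !hits.isEmpty { print("Suspicious indicators found: \(hits)") }
```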

Paco Hope, software security consultant at Cigital, says the process needs to begin earlier. "Analyzing binaries after they are built or penetration testing web and cloud apps after they are deployed provides limited assurance against vulnerabilities that are egregious and obvious," he says. "Secure software begins earlier, like when it is designed and developed. And there are no silver bullets -- no tools that simply take care of the problem so that the people don’t need to do it themselves. It is important to incorporate security throughout the development process, right down to the provenance and selection of the development toolchain itself."
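
On the toolchain-provenance point, one simple check a developer can run before trusting a downloaded copy of Xcode is Gatekeeper's command-line assessor, spctl. The Swift sketch below shells out to spctl against the default install path; the path and the reading of the output are assumptions about a typical setup, and the check verifies only that the bundle's signature chains back to Apple.

```swift
import Foundation

// Minimal macOS sketch: run Gatekeeper's command-line assessor against
// Xcode.app. Output along the lines of "accepted ... source=Mac App Store"
// (or "source=Apple") indicates the signature chains back to Apple; anything
// else is reason to re-download Xcode from Apple directly.
func assessXcode(at path: String = "/Applications/Xcode.app") throws -> String {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/sbin/spctl")
    process.arguments = ["--assess", "--verbose", path]

    let pipe = Pipe()
    process.standardOutput = pipe
    process.standardError = pipe   // spctl reports its verdict on stderr

    try process.run()
    process.waitUntilExit()
    return String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(), as: UTF8.self)
}

// Usage:
// print(try assessXcode())
```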

Of course, attackers didn't actually need to involve third-party app developers at all. As Palo Alto reported:

XcodeGhost disclosed a very easy way to Trojanize apps built with Xcode. In fact, attackers do not need to trick developers into downloading untrusted Xcode packages, but can write an OS X malware that directly drops a malicious object file in the Xcode directory without any special permission.
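
Because tampering could just as easily happen to a copy of Xcode that is already installed, a deep verification of the bundle's code signature is a reasonable complementary check. The Swift sketch below shells out to codesign with the --deep and --strict flags; how much tampering a signature check actually catches depends on what Apple's signature seals, so treat it as one signal rather than a guarantee.

```swift
import Foundation

// Minimal sketch: verify Xcode.app's code signature, including nested code.
// A non-zero exit status means verification failed and the install should be
// replaced from a trusted source. This is one signal, not proof of integrity.
func verifyXcodeSignature(at path: String = "/Applications/Xcode.app") throws -> Bool {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/codesign")
    process.arguments = ["--verify", "--deep", "--strict", path]
    try process.run()
    process.waitUntilExit()
    return process.terminationStatus == 0
}
```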

Nevertheless, Beardsley has an optimistic viewpoint. "Given that little damage was done, this event was effectively a drill that provided a valuable object lesson in risky decision making. Ultimately, XcodeGhost may help influence more secure behavior and provides an incentive for Apple to make sure that regional distributions of core programming tools are at least as easy to use as their ad-hoc counterparts."

About the Author(s)

Sara Peters

Senior Editor

Sara Peters is Senior Editor at Dark Reading and formerly the editor-in-chief of Enterprise Efficiency. Prior to that, she was senior editor for the Computer Security Institute, writing and speaking about virtualization, identity management, cybersecurity law, and a myriad of other topics. She authored the 2009 CSI Computer Crime and Security Survey and founded the CSI Working Group on Web Security Research Law -- a collaborative project that investigated the dichotomy between laws regulating software vulnerability disclosure and those regulating Web vulnerability disclosure.

