FaceTime Bug an AppSec Fail
Apple has shut off Group FaceTime while it prepares a fix for a security flaw discovered by a 14-year-old gamer.
January 29, 2019
The glaring security flaw in FaceTime that has rocked the Apple community since it went viral late yesterday actually was first found on January 19 by a 14-year-old who stumbled upon it while setting up a group chat with friends playing Fortnite.
Apple disabled the Group FaceTime service yesterday, January 28, at 10:16 p.m. PDT, after word of the bug and a video of how to abuse it spread like wildfire over social media and caught the attention of security experts. And the company — which late yesterday said it will issue a patch for the bug this week — was a little late to the party: Michele Thompson, the mother of the teenage gamer, Grant, who found the flaw, told media that she attempted to contact Apple about the bug but got nowhere. She even tweeted about it on January 20 after not getting a response from Apple Support:
My teen found a major security flaw in Apple's new iOS. He can listen in to your iPhone/iPad without your approval. I have video. Submitted bug report to @AppleSupport...waiting to hear back to provide details. Scary stuff! #apple #bugreport @foxnews
The vulnerability allows a Group FaceTime caller to access your audio and video even if you don't pick up the call. Grant found that after he FaceTimed one friend who didn't pick up and then added a second friend to the call, he could hear audio from the first friend's microphone, even though that friend hadn't answered. He could hear the ringing sound on the first friend's phone, he told NBC News.
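To make the reported behavior concrete, here is a minimal, purely hypothetical sketch in Swift. None of these types or functions come from Apple's code; they only illustrate how promoting a ringing one-to-one call into a group call could plausibly start a callee's media stream before the call is answered.

```swift
// Hypothetical model of a group call; not Apple's implementation.
enum CallState {
    case ringing      // callee has not answered yet
    case connected    // callee answered; media may flow
}

struct Participant {
    let phoneNumber: String
    var state: CallState = .ringing
    var micStreaming = false
}

final class GroupCall {
    private(set) var participants: [Participant]

    init(callee: String) {
        participants = [Participant(phoneNumber: callee)]
    }

    // Flawed logic: promoting the call to a group call activates media for
    // every participant without checking whether each one has answered.
    func addParticipantFlawed(_ number: String) {
        participants.append(Participant(phoneNumber: number))
        for i in participants.indices {
            participants[i].micStreaming = true   // BUG: still-ringing callees start streaming
        }
    }

    // Safer logic: only participants who have actually answered get a live stream.
    func addParticipantChecked(_ number: String) {
        participants.append(Participant(phoneNumber: number))
        for i in participants.indices where participants[i].state == .connected {
            participants[i].micStreaming = true
        }
    }
}

// The reported reproduction: call a friend, then add another number while it rings.
let call = GroupCall(callee: "friend-A")
call.addParticipantFlawed("friend-B")
print(call.participants.map { $0.micStreaming })   // [true, true]: audio flows before pickup
```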
Aside from an obvious failure of communication in Apple's process for vulnerability reporting, the painfully simple bug also exposes a likely collapse in the final vetting phase of the vendor's software development life cycle. With a company like Apple, known for advanced privacy and security features in its software and hardware architecture, the bug demonstrates how even the best development programs can miss security problems.
Chris Eng, vice president of research at Veracode, says the Group FaceTime vulnerability appears to be a design flaw that should have been detected during Apple's threat modeling process, a step-by-step exercise where developers explore potential use and abuse cases in an application. The development team walks through a final flight-check of sorts, exploring usage possibilities such as: What if the user adds another contact's number to the Group call? What if the user adds his own number?
"It seems like an obvious scenario you'd expect them to go through in workflow in handling [potential] abuse, or they didn't account for that particular case," Eng says. "It seems straightforward enough that a light whiteboard review of this thing" even would have caught it, he says.
This design flaw isn't a deep architectural issue in Group FaceTime, Eng notes, but, rather, the result of a flawed new feature that wasn't fully vetted for problems like this one. "I wonder if they were under time pressure, or they wanted to squeeze it into a release," he says.
Apple had not yet responded as of this posting to a request for details on the flaw and what happened with the Thompsons' reporting of it.
The good news about the bug, according to Eng, is that it appears to have a limited impact on FaceTime users. "It's a bad bug, no question, but it's not like you have an unlimited spying tool," he says, given that the early tests show it provided a short window of audio and video access.
"Hopefully, they'll [Apple] go back and do a root-cause analysis" to determine where the threat modeling fell apart for the flawed feature, he says.
Apple's development process for the FaceTime code appears to have missed the mark, notes Chris Pierson, CEO of security firm BlackCloak. "Errors and problems in coding and implementation happen all the time in the software process. That is why static and dynamic security testing is critical as well as a robust Q&A [quality assurance] process," he says.
Pierson echoes Eng's theory that there was likely an oversight in testing out any issues with the app before launching it. "In this case, it looks like the Q&A process failed to identify this risk in its preliminary or regression testing models. It could be a failure in the process itself, or a failure in imagination to test something like these steps. In any case, an enormous oversight," Pierson says.
The Fix
While architectural design flaws in software are difficult to remedy, this type of feature flaw is not. "I don't think this is hard to fix," Eng notes. Apple may have to remove the ability to add a user's phone number to the Group FaceTime call, for example, he says.
"This doesn't [appear to] affect the whole FaceTime architecture and product. I think it's one use case where they didn't think about how it was going to handle this particular group-calling feature," Eng says.
It's a chilling reminder that even major companies with mature secure development programs can miss things. "Even though Apple has gone through great strides to protect their users' information, this latest bug is yet another reinforcement that privacy continues to remain a major concern regardless of your company's size or security and privacy investments," notes George Gerchow, CSO at SumoLogic. "It's also another reminder that nobody's data is 100% safe and that it's all of our responsibility to be more diligent in protecting the privacy of our customers' sensitive information against future vulnerabilities."