When Facebook removed the messaging capability from its mobile application and required users to download a separate Messenger app to chat, the social media giant ignited a firestorm of backlash over the troves of sensitive data the app collected.
Jonathan Zdziarski, a forensics researcher and self-proclaimed hacker, tweeted that the app “appears to have more spyware type code in it than I’ve seen in products intended specifically for enterprise surveillance.” He told Motherboard that it seems “Facebook is running analytics on nearly everything it possibly can monitor on your device.”
In response, a Messenger engineer identified on Twitter as @lucyz explained that developers use all that analytics data to make the app faster and more efficient. She tweeted that “Analytics showed us people were using Like stickers a bunch, so we moved that feature so people can send in fewer taps.”
Messenger engineers wanted to scoop up all that data to make the app better for users. That’s an engineer’s job, after all. But didn’t company policymakers realize that capturing so much sensitive information would alarm users, possibly increase the likelihood of data breaches, and potentially put consumer data at risk?
No, because those issues likely weren’t part of the discussion about the data collection in the first place.
As in many similar instances, Facebook’s engineers and policymakers probably didn’t consult each other during the software-building process, making a unified approach to user privacy and security impossible. Engineers need a better partnership with the finance, legal, and security teams (aka the policymakers) throughout software development.
Without this collaboration, a company risks data breaches, damage to its reputation, and lost customers. And after the scourge of recent data disasters, it’s more important than ever to bridge the gap between policymakers and engineers.
Perceptions create dissonance
One reason such silos exist within companies is that engineers typically see policymakers as barriers. Engineers don’t think policymakers understand what they do and assume policymakers will ultimately just impede their work. It’s easy for engineers to put their heads down and build, forgetting that the software ecosystem involves everyone in the enterprise.
On the flip side, policymakers see engineers as hackers — necessary but unreliable.
A lead security practice owner recently told me, “Software engineers know nothing about security; hence, we have completely given up on them and simply assume they will be writing horrific code.”
But the reality is that both sides need to work together to build a transparent process. Building a piece of software isn’t simply a matter of arranging code to accomplish a task. It’s far more complicated than that, and the consequences of failing to collaborate are serious.
Apple came under fire this summer over iCloud security when nude celebrity photos were posted online. A month after the leaks, Apple implemented tougher security measures to bolster iCloud, but the damage was already done.
A reactive approach isn’t sufficient. As more breaches are uncovered and dissected, companies that handle sensitive personal data will undergo tougher scrutiny than ever.
If corporate executives don’t figure out a way to bring their policymakers and engineers together and address these issues during the development phase, the government will likely step in to fill the void. The end result will be strict regulations that do little to make the web safer or more transparent — but will instead betray the core concept of the Internet as a space to share knowledge freely and equally.
Companies must dismantle silos
There is a growing recognition that something has to change. Technology companies are at a turning point, and the organizations that are willing to break down barriers between engineers and policymakers will be the ones that come out on top.
The Cloud Security Alliance and the Software Assurance Forum for Excellence in Code (SAFECode) have partnered to promote software security best practices that can help companies avoid common cloud computing threats. The CSA has also partnered with the International Association of Privacy Professionals to host the IAPP Privacy Academy and CSA Congress, which brought together privacy and cloud security professionals for discussion around these issues. In addition, the recently announced Stanford Cyber Initiative aims to address opportunities and challenges associated with new technologies, including privacy and security.
More companies are sure to follow suit, and it all starts with getting policymakers and engineers in the same room to talk about company goals. Both sides must understand and accept that software has become an ecosystem that touches everyone in the process, and leaders must build their teams with business stakeholders working alongside engineers and designers.
Transparency is the best way to mitigate privacy concerns and reduce the risk of data breaches. By encouraging cross-discipline collaboration during the development phase, businesses can ensure they’re building products that are user-friendly and secure.