Yes, FaceApp Really Could Be Sending Your Data to Russia
FaceApp has an unprecedented level of access to data from 150 million users. What could its endgame be? We unpack three potential risks.
FaceApp, an app that offers special effects for photographs, has been downloaded and installed by more than 150 million people worldwide, according to consumer tech journalist John Koetsier, writing in Forbes. The most popular of these special effects is an artificial intelligence (AI)-enhanced photo filter that ages any face in a photograph. The feature's recent surge in popularity has put the app's privacy practices under scrutiny on a global scale.
FaceApp is developed and published by Wireless Lab, a company headquartered in St. Petersburg, Russia. While Wireless Lab and its staff are based in Russia, company founder Yaroslav Goncharov told Forbes that all of the app's storage and cloud resources are in the US and that the data the app collects is hosted there, not in Russia. Because FaceApp has such close ties to Russia, American politicians have raised concerns about the overall security of the app. But the potential issues go beyond where the data is hosted.
Here are three risks to unpack before deciding to use FaceApp:
Risk 1: Terms and Conditions
As with any social media app, the terms and conditions form the bulk of your contract and are the primary mechanism that is supposed to protect you, the user. But terms are also a way for many companies to signal what they anticipate doing with your information in the future, often long before they actually act on those plans.
I've been in the security space for nearly 30 years, and FaceApp's terms are among the worst I have seen. For example:
● You assign FaceApp irrevocable global rights to use your images or data as it sees fit without any need to compensate or inform you.
● FaceApp can continue to hold your images and data even after you have requested your information be deleted.
● The company reserves the right to share the data with any third party it chooses without any need to inform you.
● It reserves the right to host the data in any country it chooses.
As shocking as some of these terms are, you will find very similar language in many well-known social media apps, including Facebook, according to Dalvin Brown, writing in USA TODAY. This approach is almost certainly incompatible with legislation such as the EU's General Data Protection Regulation (GDPR), which suggests Wireless Lab is ignoring international privacy regulations.
In response to widespread criticism, Goncharov is quoted in Forbes as suggesting that the company might consider updating its terms, but it has yet to make any concrete promises. For now, FaceApp's terms read as though the company is collecting your data, has long-term plans for it, and is under no obligation to honor any request or demand you make about that data's future.
Risk 2: Murky International Legal Regulations Around Data Privacy
It's not just the terms and conditions that are problematic. Wireless Lab operates in a country whose legal processes and privacy legislation differ greatly from those of the US, and that should be a significant red flag: if the company does something you don't like, you likely have little or no legal recourse.
As this story has developed, it has become clear to me that Wireless Lab's statement that the app is wholly hosted in the US may not be the complete picture. Host records indicate that one of the hosts the app communicates with is, in fact, in Russia. While it's not clear what data is being sent to this Russian host, the fact that it's there — even after the developer stated everything is in the US — is concerning.
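Readers who want to check this kind of claim themselves can do so with ordinary network tooling. The sketch below shows one way: resolve a hostname the app communicates with and look up which organization and country hold the resulting IP addresses. The hostname here is a placeholder, not one confirmed from FaceApp's traffic; you would substitute a host observed with a traffic-inspection proxy such as mitmproxy, and the whois command-line tool must be installed.

    import socket
    import subprocess

    # Placeholder hostname -- substitute a host actually observed in the app's
    # network traffic (for example, via a proxy such as mitmproxy).
    HOSTNAME = "api.example-faceapp-host.com"

    # Resolve the hostname to its IP addresses.
    try:
        addresses = {info[4][0] for info in socket.getaddrinfo(HOSTNAME, 443)}
    except socket.gaierror as err:
        raise SystemExit(f"Could not resolve {HOSTNAME}: {err}")

    # Query WHOIS for each address; registry records typically include the
    # country and organization that hold the IP block.
    for addr in sorted(addresses):
        print(f"--- {addr} ---")
        record = subprocess.run(["whois", addr], capture_output=True, text=True).stdout
        for line in record.splitlines():
            if line.lower().startswith(("country", "orgname", "org-name", "netname")):
                print("  " + line.strip())

WHOIS records show where an IP block is registered rather than where a server physically sits, so this is a starting point for investigation, not proof on its own.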
Risk 3: FaceApp's Endgame
FaceApp has an unprecedented level of access to data from 150 million users. What could the company's endgame be? This is where we have to speculate. To start, let's look at what the app is harvesting:
● Your photos and contextual personal information.
● Your phone information (browser, serial number, IP address, configuration information, some location information).
● Details about your other apps, the operating system on your phone, and your social media accounts.
● Cookies, sign-in tokens, and any authentication information you share with it (for example, if you choose to log in with Facebook, it gets access to your Facebook access tokens and profile information).
● If the app is installed on Android, it can also access your call history, contacts, logs, more detailed location information, messages, and more (see the permission-check sketch after this list).
This list is certainly not exhaustive; it merely encompasses the most obvious data to which the app has immediate access.
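On the Android point, you don't have to take the permission list on faith. The sketch below assumes the Android platform tools (adb) are installed and a device is connected with USB debugging enabled; it prints the permissions an installed app has requested. The package name is a guess used purely for illustration, so confirm the real one from the app's Play Store listing before running it.

    import subprocess

    # Illustrative package name -- confirm the app's real package ID from its
    # Play Store listing URL before running this against your own device.
    PACKAGE = "io.faceapp"

    # "adb shell dumpsys package <pkg>" lists the permissions the installed app
    # has requested and whether each one has been granted.
    result = subprocess.run(
        ["adb", "shell", "dumpsys", "package", PACKAGE],
        capture_output=True, text=True, check=True,
    )

    # Print only the permission lines (e.g., android.permission.READ_CONTACTS).
    for line in result.stdout.splitlines():
        if "android.permission." in line:
            print(line.strip())

Entries such as READ_CONTACTS, READ_CALL_LOG, and ACCESS_FINE_LOCATION map directly to the categories of data described above.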
What could the company be doing with this data? On the obvious end of the spectrum, detailed information about more than 150 million people is something advertisers would pay good money for. But from an intelligence perspective, this is a highly useful and current database of people all over the world and their connections.
For example, a current, AI-enhanced database like this is exactly what developers of facial recognition technology need. One of the biggest flaws in facial recognition today is that a model is only as good as the data used to train it; as a result, most models are skewed toward faces from the region where the technology was developed. A database like this could provide an extremely diverse catalog of real faces with which to train facial recognition systems.
Whatever the company's endgame is, one thing is very clear: As consumers, we need to get better at policing those with whom we share our data. The fact that almost all social media applications and services have consumer-unfriendly terms should be of great concern. As the saying goes: "If you're not paying for it, you're not the customer; you're the product being sold." It's never been more important to heed this warning.