As technology has become more ubiquitous in people's everyday lives, a new class of privacy threats has emerged in family, romantic, friendship, and caregiving relationships. Dubbed "intimate threats" by a recent academic paper in the Journal of Cybersecurity, these are thorny risks intertwined with location tracking, always-on monitoring or recording, online surveillance, and control over technology accounts or devices.
Written by Karen Levy, a lawyer and sociologist, and information security luminary Bruce Schneier, the paper examines how the dynamics of intimate relationships break the security model of many systems. It walks through real-world examples and offers recommendations to help technology designers and security professionals rethink how they build products and how they frame threat models and security use cases.
The use of technology in intimate relationships can quickly turn dark, with very little recourse for the victim, because the product was never designed to account for abuse cases.
"Facebook had a system for a while where you'd get your account back because they'd show you pictures and you'd click on the ones that are your friends, assuming that you know who they are but other people don't," Schneier says. "But your partner and your parents all know that stuff too. So it's a great system, but it fails in the intimate context. It fails when your boyfriend takes over your account."
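Schneier's example can be modeled as a knowledge-based challenge: the recovery check implicitly assumes that only the account owner can identify the friends in the photos. A minimal sketch (all names and data here are hypothetical, and the real system is more elaborate) shows why the check also passes for an intimate adversary:

```python
# Hypothetical model of photo-based account recovery ("social authentication").
# The check assumes only the account owner can name the pictured friends.
def passes_recovery_check(claimant_known_people, photo_friends):
    """Recovery succeeds if the claimant identifies every friend in the photos."""
    return photo_friends <= claimant_known_people

photo_friends = {"alice", "bob", "carol"}       # friends shown in challenge photos

owner = {"alice", "bob", "carol", "dave"}       # the account owner knows them all
stranger = {"eve"}                              # a random attacker does not
partner = {"alice", "bob", "carol", "dave"}     # an intimate partner knows them too

assert passes_recovery_check(owner, photo_friends)        # owner recovers the account
assert not passes_recovery_check(stranger, photo_friends) # stranger is blocked
assert passes_recovery_check(partner, photo_friends)      # partner passes as well:
                                                          # the check fails in the
                                                          # intimate context
```

The point of the sketch is that the security boundary is drawn around knowledge the owner shares with intimates, so the check distinguishes strangers from owners but cannot distinguish owners from partners or parents.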
While writing the paper, the authors quickly collected many illustrative examples of technology abuse involving not only covert monitoring, but also control over financial accounts or devices. For example, they pointed to shared household smart devices, such as connected thermostats, increasingly being used by abusers and vengeful exes to wreak havoc on intimate partners' daily lives.
The difficulty is that there are a lot of complicated dynamics - both ethical and technical - to account for when mitigating intimate threats, says Levy, who explains that she and Schneier spent a long time debating whether to even call them "threats." They eventually chose to use that language because it is a common and useful way for the tech community to conceptualize these problems.
"It's complicated because the motivations in the intimate context are often pretty complicated and not necessarily nefarious," she says.
For example, parents might allow a teenager a later curfew only on the condition that they can track the teen's phone. Or a parent may set up a webcam, with the nanny's full knowledge, to get a happy glimpse of their little one during a break in an otherwise hectic workday.
"There's lots of ways in which we use this in perfectly beneficent and socially negotiated ways to take care of one another," she says. "And so that means that the line between this is okay and this isn't okay is going to be very variable, both within and across relationships."
To account for these complicated dynamics, the duo broke down some of the sociological dimensions of the issue, such as the differing motivations of the user and the other affected parties, copresence, power differentials, and disparities in technical ability between people in a given relationship. They then mapped these dimensions to a set of design recommendations.
"It's certainly not a roadmap or a checklist or anything quite so definitive, but it's a list of things to consider," Levy explains. "So one of them is, when you're designing, assume that the person that is buying the product is not necessarily your only user and that you should think of the people who are subject to being monitored as potentially your users too."
The idea is to get the technology-design community thinking critically and remembering that these relationship dynamics are always going to be at play within the user base.
"When you're designing systems, we really have to take these relationships into account," Schneier says. "They are not the exceptions, they are normal. It's not something we can ignore."