There's a seemingly infinite number of studies of ransomware lately, all breathlessly explaining how to fight this dangerous threat. They're all dangerously wrong. Ransomware is not the problem.
Focusing on fighting ransomware is like fighting a pandemic by focusing on masks. You fight a pandemic by focusing on reducing transmission and improving treatments. Reducing transmission does include masks, and also vaccines, distancing, contact tracing, quarantines, and various levels of restricting movement.
Back to computer security: the real problem is that organizations are unable to control their technological systems. One symptom is that criminals can deploy software, and that software can pick up and modify arbitrary files. But that is only a symptom, and treating the symptom won't cure the disease.
A criminal gang could, with a modest increase in effort, change a pipeline company's billing software to randomly reduce all bills by an average of 7%, or sell coupons that give buyers a discount. They could insert malware into software sold by a networking company to compromise its customers.
That organizations cannot control their software is a problem with a thousand parts. Crucially, software is complex and flexible, and it's assembled from components in unique ways. The patterns involved in effective operation are unclear and rapidly changing.
Many of these emerging patterns, as practiced at Google, are described in an excellent and thought-provoking book, Building Secure and Reliable Systems. The book describes how Google has reconsidered how to build systems, and in doing so, produced a new set of patterns. It's tempting to describe these as cloud-native, but they predate the cloud, and are broader than the cloud — they include things that only a cloud provider can do, and they include desktop/end-user services that traditionally are in the realm of the help desk.
In fact, a critique of Building Secure and Reliable Systems is that many of the approaches they suggest, like rewriting the authentication system for all of your software, are hard even if you're Google, and appear impossible for mere-mortal engineering teams. But that doesn't mean the patterns are either wrong or not worthy of consideration.
One such worthwhile pattern is to use "least privilege" to isolate components from each other. In traditional desktop operating systems like Windows or macOS, applications can do roughly anything the person behind the keyboard can do, including running arbitrary software. More recently designed operating systems behave differently: on a Chromebook or an iPad, you can't run arbitrary software. Ransomware is not the problem. Arbitrarily powerful software is a problem; the difficulty of preventing arbitrary software from running on enterprise systems is a problem.
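To make the policy concrete, here's a minimal sketch, in Python and purely illustrative: a default-deny allowlist keyed on a binary's hash. Real enforcement lives in the kernel or an endpoint agent, and the file names and contents here are hypothetical.

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Tie the policy to the file's contents, not its name or location.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def may_execute(path: Path, allowlist: set[str]) -> bool:
    # Default-deny: only binaries whose hash is known-good may run.
    return sha256_of(path) in allowlist

# Hypothetical demo: an "approved" build runs; anything unrecognized is denied.
with tempfile.TemporaryDirectory() as d:
    approved = Path(d) / "approved.bin"
    approved.write_bytes(b"approved build contents")
    unknown = Path(d) / "dropper.bin"
    unknown.write_bytes(b"never seen before")

    allowlist = {sha256_of(approved)}
    assert may_execute(approved, allowlist)
    assert not may_execute(unknown, allowlist)
```

The point of the sketch is the inversion: instead of blocklisting known-bad software (the ransomware-centric view), the system refuses everything that isn't known-good.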
If your files are on a cloud system, the provider may notice that they're all being replaced by encrypted versions, or that the patterns of access have changed in ways that have led to problems for other customers. That might be ransomware encrypting them, or it might be someone (an employee, or an attacker who's taken over an employee account) copying them out. Ransomware is not the problem: the problems include defining what a problematic pattern is, detecting those patterns across many enterprises, and responding quickly. These tie to how we assemble components into useful and resilient systems.
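One such signal, as a hedged illustration: files rewritten wholesale by ransomware tend toward near-maximal byte entropy, unlike most documents. A minimal Python sketch (the 7.5 bits-per-byte threshold is an assumption; real cloud detectors combine many signals):

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    # Bits per byte: English text sits around 4-5; encrypted or
    # well-compressed data approaches the maximum of 8.
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    # Crude heuristic: near-maximal entropy suggests ciphertext.
    # Compressed media also scores high, so a real detector would
    # combine this with rename rates, extension churn, and volume.
    return shannon_entropy(data) >= threshold

# Repetitive plaintext scores low; random bytes (a stand-in for
# ciphertext) score near 8.
assert not looks_encrypted(b"Quarterly billing report, region 7.\n" * 50)
assert looks_encrypted(os.urandom(4096))
```

Note that this only answers "does this write look encrypted?"; the harder problems named above (which patterns matter, detection at scale, response) remain.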
Ransomware is not the problem; operating systems that give unrestricted access to applications were a more elegant design for a simpler time. There are other improvements that system designers can make that prevent both ransomware and bulk exfiltration.
For example, in macOS Big Sur, applications can write only to a few specific directories by default, and reading or writing the Documents folder (among other places) is restricted. macOS, like Windows, has transparently moved the file dialog into the operating system, so the system knows that a human is selecting files. The kernel tracks (and Activity Monitor shows) how many bytes each application is sending or receiving. It would be easy (and perhaps annoying) to alert on unusual patterns.
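Such an alert could be as simple as a baseline-and-deviation rule over the per-application byte counts the kernel already tracks. A sketch, with assumed sampling intervals and an assumed 3-sigma threshold:

```python
import statistics

def unusual_egress(history: list[int], current: int, sigma: float = 3.0) -> bool:
    """Flag a byte count far above an app's historical baseline.

    `history` holds per-interval bytes sent (say, sampled each minute);
    a simple mean + k*stddev rule stands in for whatever a real
    detector would do with the kernel's per-process accounting.
    """
    if len(history) < 2:
        return False  # not enough data to call anything unusual
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    # The max() keeps a flat baseline from making every blip an alert.
    return current > mean + sigma * max(stdev, 1.0)

# A word processor that normally trickles out a few KB per minute
# suddenly pushing hundreds of MB looks like bulk exfiltration.
baseline = [2_000, 3_500, 1_800, 2_900, 3_100]
assert not unusual_egress(baseline, 4_000)
assert unusual_egress(baseline, 400_000_000)
```

The same rule catches both of the failure modes above: ransomware staging files for exfiltration and an insider copying them out, because it watches behavior rather than any particular piece of malware.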
Flexibility has enabled incredible innovation in very short times. The ability to assemble components into useful agglomerations makes those innovations cheap. We assemble components at all sorts of scales: OEMs make PCs from a diverse array of parts from different suppliers. Software makers build "solutions" from a mix of open source and commercial code. Some of that runs on operating systems like Linux, whose distros are mixes of open source packages and glue configuration; and those distros are combined into container images, similarly mixed. Our ability to combine these in varied ways contributes to innovation.
When we want to secure these assemblies, we often call on tools like isolation. We want to isolate packages and system users from each other (a property violated by Spectre- and Meltdown-style attacks), and we want to isolate systems from each other. Isolation is a security and reliability tool, enabled by opacity. Opacity means that software can't reach across the boundary and arbitrarily change things on the other side. That is, "no user serviceable parts inside." That opacity may mean that Excel can't run macros, or PowerPoint can't run arbitrary actions. It may limit how "hook" extension points can be configured. It certainly adds to administrative load while improving security.
We need ways to construct, combine, operate, and observe software at new scales and in new ways. Worrying about ransomware distracts us from these challenges; solving them will solve ransomware in ways that enable more innovation and value creation.