A lack of precision in our terminology leads to misunderstandings and confusion about the activities we engage in, the information we share, and the expectations we hold.

Sounil Yu, CISO and Head of Research, JupiterOne

November 9, 2022


Many languages use the same word for safety and security. That is also true in the world of cyber, where we frequently say cybersecurity when we really mean cyber safety. They are, however, distinct concepts, and the lack of precision in our terminology leads to misunderstandings and confusion about the activities we engage in, the information we share, and the expectations we hold.

To simplify the distinction between safety and security, it helps to put another descriptor in front of these words. For example, food safety practices include hygiene, third-party inspections, and checklists. Food security evokes concerns about the shortage of baby formula, poisoning of the food supply, and starvation. Food safety and food security are not the same.

Cyber Safety ≠ Cybersecurity

Similarly, cyber safety and cybersecurity are not the same. If you believe that compliance does not equal security, perhaps it is because compliance is about safety. Adherence to good safety practices generally improves the quality of the output, while security often delays it. Good safety practices don't eliminate the possibility of intentional compromise, but practices that promote higher-quality outputs enable investigators to quickly rule out accidental causes.

If you wonder why some types of information in cyber are widely shared and others are not, consider that, regardless of geopolitical affiliations, we openly share nuclear safety practices but not nuclear security practices. We naturally tend to be transparent about safety but not about security. We generally want safety measures to be highly visible, obvious, and well known. Hotels and airlines prominently and repeatedly share the measures they have taken to ensure our safety. Most manufacturing facilities prominently display a sign showing how long they have gone without a safety incident.

Conversely, we tend to keep security measures invisible and unknown (unless we explicitly want to deter attackers through overt displays of guns, guards, and gates). This mindset extends to information sharing and may explain our general reluctance to voluntarily share security information with outside parties (and even internally).

Safety measures may be hidden from view in some cases, such as car airbags and elevator brakes. Even so, we see inspection certificates demonstrating that minimum safety standards have been met. In the world of cyber, we are subjected to numerous and varied inspections, but the results are often hidden from public view. If routine assessments such as SOC 2 and ISO 27001 are more akin to safety inspections, perhaps their results should be made public by default (as with SOC 3) to communicate when we have met minimum cyber-safety measures.

Taking Personal Responsibility

Individual choices have a direct impact on our safety. For example, most of us know what steps we can take to improve our personal hygiene and are appalled when others neglect or ignore such simple steps. Security, on the other hand, is often seen as someone else's responsibility, with the individual usually limited to a passive "see something, say something" role.

Safety requires active participation from everyone, and most people embrace safety measures as a personal responsibility. Individuals can see how they directly contribute to the improvement (or deterioration) of safety. By recasting many of the common cyber activities we ask of others (e.g., patching) as actions that promote cyber safety, we can instill a greater sense of personal responsibility and accountability among an organization's stakeholders for maintaining proper cyber hygiene.

To remind us of our personal responsibility for safety, we receive safety awareness briefings with every flight (with the flight attendants pleading with us to pay attention even if we've heard it all before). If we try to drive away without buckling our seat belts, our vehicles chime in with pleasant tones. In many other domains, safety awareness happens regularly and is not reserved for just one month in October.

Making Safety Usable

Framing our activities in terms of safety can also help cyber practitioners understand that we cannot go overboard on cyber-safety measures. Many of us may want all vulnerabilities patched immediately, but requiring food service workers to clean food preparation areas whenever a speck of dust falls would bring productivity to a grinding halt. Similarly, insisting on immediate patching of all code vulnerabilities may hamper software development. For safety measures to be truly effective, we must understand and establish reasonable margins of safety. The fact is that most vulnerabilities do not need to be patched right away, and we can increase our margin of safety by implementing compensating cyber-safety controls, allowing us to postpone patching to a more opportune time.
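To make the idea of a margin of safety concrete, consider a minimal sketch of a patch-triage rule in which compensating controls widen the window before a fix must be applied. This is an illustration, not a prescription from the article; the field names, conditions, and time windows are hypothetical assumptions.

from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    actively_exploited: bool     # e.g., appears on a known-exploited list (assumption)
    internet_facing: bool        # reachable from outside the organization
    compensating_control: bool   # e.g., WAF rule, segmentation, feature disabled

def patch_window(v: Vulnerability) -> str:
    """Rough triage: compensating controls widen the margin of safety."""
    if v.actively_exploited and v.internet_facing and not v.compensating_control:
        return "patch immediately"
    if v.actively_exploited or (v.internet_facing and not v.compensating_control):
        return "patch within days"
    return "patch in the next scheduled maintenance window"

# Placeholder CVE IDs for illustration only
print(patch_window(Vulnerability("CVE-0000-0001", True, True, False)))   # patch immediately
print(patch_window(Vulnerability("CVE-0000-0002", False, True, True)))   # next maintenance window
print(patch_window(Vulnerability("CVE-0000-0003", False, False, False))) # next maintenance window

The point of the sketch is simply that a documented compensating control can justify deferring a patch to a planned window rather than treating every finding as an emergency.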

Importantly, these cyber-safety controls must be easy to use, with little to no room for operator error. Unfortunately, we are far from that today. Our current cyber-safety mechanisms operate like child safety seats from the 1980s: parents had to figure out a complex harness system, and if they got it wrong, they were treated as idiots. In our digital environments today, the user is often blamed for being the weakest link despite confusing and unhelpful interfaces.

Making child safety seats easier to install required cooperation from both car and seat manufacturers, as well as a federal requirement to comply with the LATCH (Lower Anchors and Tethers for Children) system. Personal responsibility is still a factor, but the multiparty collaboration among federal regulators, carmakers, and child safety seat manufacturers enabled parents to avoid common mistakes.

Such coordination among producers, consumers, and regulators for cyber safety is sorely lacking in the digital world. But perhaps the starting point is to get everyone on the same page by understanding the real differences between cybersecurity and cyber safety.

About the Author

Sounil Yu

CISO and Head of Research, JupiterOne

Sounil Yu is the CISO and head of research at JupiterOne, a cyber asset management platform. He was previously the CISO-in-Residence for YL Ventures, where he worked closely with aspiring entrepreneurs to validate their startup ideas and develop approaches to hard problems in cybersecurity. Prior to that, Yu served at Bank of America as its Chief Security Scientist and at Booz Allen Hamilton, where he helped improve security at several Fortune 100 companies and government agencies. He is the creator of the Cyber Defense Matrix and the D.I.E. Triad, which are helping to reshape how the industry thinks about and approaches cybersecurity. He serves on the boards of the FAIR Institute and SCVX; co-chairs Art into Science: A Conference on Defense; volunteers for Project N95; contributes as a visiting National Security Institute fellow at GMU's Scalia Law School; and advises many security startups.

