Commentary | 6/19/2013 01:25 PM
Rich Mogull

Security Needs More Designers, Not Architects

The better we design the user experience, the more we reduce our risk

A few years ago I somewhat egotistically wrote Mogull's Law in a blog post. It states, "The rate of user compliance with a security control is directly proportional to the pain of the control vs. the pain of noncompliance."

A shorter version of saying this is, "Computer users will take the path of least resistance."
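The law can be read as a simple ratio: compliance rises as the pain of complying falls relative to the pain of skipping the control. Here is a minimal, purely illustrative sketch of that idea; the function name and the numbers are hypothetical, not anything from Mogull's original post.

```python
def predicted_compliance(pain_of_control: float, pain_of_noncompliance: float) -> float:
    """Illustrative 0..1 compliance estimate per Mogull's Law:
    users comply when the control hurts less than the consequences
    of skipping it. All values here are hypothetical."""
    total = pain_of_control + pain_of_noncompliance
    if total == 0:
        return 1.0  # no pain either way; compliance is free
    return pain_of_noncompliance / total

# A painless control (auto-updates, SSO) with real consequences for skipping it:
print(predicted_compliance(1, 9))  # high compliance: 0.9
# An onerous control users can skip with little consequence:
print(predicted_compliance(9, 1))  # low compliance: 0.1
```

The exact function is invented; the point is the shape of the curve. Make the control cheaper, or the consequences of skipping it clearer, and compliance follows.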

The overall experience of using any manmade object (physical or digital) is a direct outcome of its design. This is as true for security as it is for a chair, an automobile, or anything else. The additional challenge for security is that, unlike many other things we interact with, ideally it makes a problem disappear, and does so invisibly. Security is fundamentally about preventing unwanted outcomes, and it is one of the most difficult design problems out there.

Gunnar Peterson once said in a Securosis research meeting, "Every security control is a denial-of-service attack against your users." No one wants to enter a password, set an access control, update his anti-malware, or classify a document. Security administrators don't want to crawl through packets or examine logs; they want to stop incidents. And while I realize some of you might be thinking, "Yes, I do," hopefully you recognize you are a mutant.

In the context of Mogull's Law, this means the best security is often that which offers users the easiest path. For end users this increases compliance with security needs, and for security administrators it means they can solve security issues more quickly. This is easy to say, but translating it to design is incredibly difficult, especially since usability and security can't always be completely reconciled.

Then again, it isn't as if we have even tried. Very few security products demonstrate an acumen for user experience, and even fewer organizations devote resources to integrating design into their security control implementations. We hire architects to integrate technology and figure out where the blocks go, but rarely correlate that with human behavior and business workflow.

Consumer-focused organizations like Apple, Microsoft, and Adobe have been focusing more on the user experience side of security design for a few years, with growing success. For example, Microsoft defaults the operating system's software update settings so that avoiding patches takes more effort than installing them. Apple defaults OS X to a state that adds steps when a user attempts to install software from untrusted sources, reducing the chances he is tricked into installing malware. Adobe (yes, Adobe) has dedicated considerable effort to steering users away from the higher-risk features of Reader and Flash, while also increasing patch effectiveness. Are these companies perfect? Not at all, but they certainly make the effort.

There are a few core design principles that can materially improve security compliance and effectiveness. They are different for the end user experience versus security administrators, but are very consistent in tone.

The goal when implementing security for end users should be to make a problem disappear, implement the security control as invisibly as possible, and steer users toward the most secure option with positive, not negative, reinforcement. Security should seamlessly integrate with their workflows, not add an obstacle that impedes their ability to get their work done.

Take BYOD, for example. If you degrade the user experience on the device with onerous security requirements, then odds are the user will seek alternative options that eventually increase your security risk. On the other hand, if you tell users you will securely back up their device, can remote wipe it if it's lost without them losing any data, and otherwise allow them to use the device the way they want, then it's very likely they will opt in to remote management and secure messaging. The first time you wipe a device and then tell the user it's his fault for not backing up the photos of his kids, hatred for security will spread through the company like wildfire.

A real-world example is keyless entry and push-button start for cars. Push-button start eliminates the need to take your keys out of your pocket (it's usually paired with auto-unlocking door handles when the key fob is in your pocket). Not having to manually lock the door with a key also makes it insanely easy to lock your keys inside, so the vehicles include sensors to make it nearly impossible to accidentally leave your keys inside. The manufacturers reduced the friction of using car keys, and did so in a way that also reduced the chances of user error.

For security products, the design focus is a little different: reducing friction by minimizing the product-specific knowledge someone needs to remember to make the product work, and reducing the time the administrator needs to get the job done. The user interface of most security products is a disaster. They require you to learn the internal vocabulary of the engineers who built them and adapt to their guesses about your workflow. The products suck you in, forgetting that your job is to knock things off the to-do list as quickly as possible. The developers assume you will remember the ins and outs of the product, not realizing that, with a few exceptions, administrators hop in and out of tools and shouldn't need arcane training to finish a task.

I was once sitting with the CEO of an early DLP vendor who looked at me incredulously when I informed him that, yes, his tool needs to highlight the specific data that violated a policy in the UI. In yellow. The market-leading product had the feature for a while, and security administrators loved it since it saved seconds or minutes in assessing incidents.

The better we design the user experience of security, the less we need to train users or worry about them circumventing our controls. The better we design security products, the more we increase the effectiveness and efficiency of security professionals. User experience matters, and we need more designers, not policy writers, awareness trainers, or architects.

Comments
Samreen M, User Rank: Apprentice
7/5/2013 | 7:17:41 AM
re: Security Needs More Designers, Not Architects
Yes, security needs more designers rather than architects. As security becomes an increasingly serious issue, users' data must be protected from unintended and unauthorized access, change, or destruction.
The article is quite informative, and suggestive of how to make security systems better and stronger, but as you said, Rich, it is easy to say and difficult to translate into design.

Samreen M
Bolee.com
macker490, User Rank: Ninja
7/4/2013 | 11:32:09 AM
re: Security Needs More Designers, Not Architects
security needs STUDENTS
study attackers. this is the way to learn to block them.