
11/5/2007
03:16 AM

Learning From Tylenol

Are you prepared for your next security crisis? Learn these lessons before you hit the panic button

After a lot of thought, I’m fairly certain that I would never recommend my career path to an aspiring information security professional. I started out by hacking friends' bulletin board systems back in the 1980s – something society has become far less tolerant of today.

At age 18, I was working on the security staff at concerts and football games – at 5'7" and 135 pounds, I got a learning experience, but not the kind that necessarily improves your exploit development or network protection skills. Later, I worked in the back of an ambulance and did mountain rescue – again, experiences that taught me a lot, but not the kinds of things that help you roll out that security awareness program.

An early career path like this is definitely a lot of fun, however. And on the off chance you survive the first part of the training program and then get a job as a PC technician, you might be able to build it into an IT security career, like I did.

First, turn that PC tech job into a systems and network administrator position. Learn HTML when Mosaic comes out, and turn that into your very own software development department at a major university. All of a sudden computers don’t look so bad.

Leave the university with a history degree, start your own security and web development consulting company, get recruited by a better company that’s acquired by Gartner three months later, and you might wake up one morning as an industry analyst.

Spend seven years as an industry analyst specializing in data and application security, and you might just be able to start a blog and go into independent consulting. I hope you make it through this strange career program – then I won’t be the only washed-up-former-paramedic-ski-patroller-rock-security-ex-analyst-data-security-geek-blogger-consultant in the business.

One advantage of such a strange career path – which spans physical security and emergency response along with more traditional technical positions – is that you learn lessons in the physical world that many who work only in IT overlook. For example, take breach disclosure.

More often than not, when an organization is the victim of a data breach, its people fumble the response. Many organizations attempt to cover up the breach, with varying degrees of success. Once the breach is revealed, most organizations attempt to minimize the potential impact with press statements that always seem to start with, "The safety of our customers is our top concern" and end with "we don’t believe our customers are at risk." If the organization does explain how a breach occurred, it usually doesn’t release details until months – if not years – after the incident.

Is this the best approach to breach disclosure? To answer this question, let's try a little regression therapy. Take yourself back to 1982 – the year the Commodore 64 was released – and imagine what would happen if Johnson & Johnson treated the Tylenol tampering case like a modern breach disclosure.

If you don't remember it, back in the fall of 1982, seven people in the Chicago area died of cyanide poisoning, and law enforcement investigations indicated that Tylenol was involved, perhaps through contamination in the manufacturing process or through bottle-tampering in retail stores.

If Johnson & Johnson followed today's breach disclosure practices, the company would perform an internal investigation of its factories while urging the police to avoid making any public statements. The lawyers would begin combing through business contracts and regulations to see if the company had any legal obligation to disclose the tampering.

Eventually, someone in PR would learn that the police were going to go public, and Johnson & Johnson would begin notifying only the potentially affected stores in the Chicago area. The company would release a statement saying that it takes the lives of its customers very seriously and has engaged a consulting firm to identify the scope of the problem. Sick customers would be advised to enroll in a health monitoring program.

In breach-disclosure world, the police investigation would reveal that the Tylenol was tampered with after it reached store shelves. Johnson & Johnson would issue a press release stating that its internal security controls were not breached, and that the deaths were the result of security lapses at its trusted business partners.

Company executives in our hypothetical situation would begin calling reporters and explaining that the company can’t be held responsible for lapses at the retail outlets. A former employee would go public and state (on a BBS, since it’s 1982) that he or she warned the company of potential tampering with the current product line, but was fired for demonstrating the techniques to management during a meeting.

More people would die, the details would never become public, and nothing would prevent future incidents with other companies.

Fortunately, Johnson & Johnson didn’t treat the Tylenol tampering case like most enterprises treat data breach disclosure. Once its product was identified as the source of the deaths, the company immediately halted all production and advertising.

Hospitals were notified, a national recall was issued, and Johnson & Johnson began a national advertising campaign warning people against using any products containing Tylenol. The exact source of the tampering was identified and publicized, and Tylenol was reintroduced into the market with the tamper-resistant packaging that’s now standard across the industry. In less than a year, J&J stock returned to normal levels and Tylenol continued to dominate the market.

The Tylenol tampering incident is now a textbook example of crisis communications – something we’re actually taught in emergency services. By accepting responsibility, placing its customers first, communicating clearly and honestly, and taking decisive action to end the incident and ease public safety concerns, Johnson & Johnson retained the trust of the public – and the market for Tylenol.

Truth, honesty, and respect for your customers are a strength, not a weakness. Crisis communications experts know that spinning an incident with anything less usually leads to far worse consequences.

— Rich Mogull is founder of Securosis LLC and a former security industry analyst for Gartner. Special to Dark Reading.
