Learning From Auditor War Stories

Stories of IT missteps and unforeseen disasters that unfolded while auditors were on-site point to important lessons in preparing for compliance and security.

Sometimes the best lessons come from cautionary tales lived by those before us who didn't get things right the first time around. And in the IT compliance world, no one is more prepared to offer up those stories than the auditors and assessors tasked with checking up on IT practices.

"These types of war stories are really the opportunity for people to learn and recognize that sometimes the extraordinary can be ordinary, and businesses need to be prepared for that," says Brian Christensen, head of global internal audit for Protiviti. "There are a plethora of these stories, and they're a great learning opportunity."

Dark Reading recently spoke with Christensen and a number of other auditors and assessors to get the lowdown on some of the biggest mistakes or disasters they've witnessed firsthand. Here are a few of their accounts, along with their insights on the lessons other organizations can take from them.

Expecting The Unexpected
People might talk about a particular audit being a disaster, but how many organizations can say they've experienced an actual disaster while the auditor was on the premises? Christensen has been around long enough to witness it happen firsthand. And as chance would have it, he and the CIO of the company he was working with were discussing business continuity management when it happened.

"Imagine a warehouse type of environment and his office was there, and in the background you could see a big transformer," he says, explaining that as they talked, a bird landed on the transformer, which blew up. "The power for the entire facility went out. And I remember chuckling and saying, 'This is an opportunity for you to demonstrate that your business interruption planning really works. This is not a test.'"

According to Christensen, that was a prime example of why IT organizations should be prepared for unforeseen events to throw a wrinkle into the process. In the same vein, they should also stay at the ready in case they're subject to unannounced observation.

"When we do surprise inspections, they're often shocked when we're able to kind of walk in," he says. In one instance when he was engaged with a global organization to do surprise inspections of remote data center locations, Christensen and his team were able to observe the parking lot for traffic patterns so that they were able to follow in workers through the doors of a sensitive site during the post-lunch rush.

"The advice that we’ve learned from that type of instance is always be prepared for the unexpected," he says.

Auditors Have Eyes And They're Always Watching
While some simple social engineering tricks often show whether employees really are living up to written policies with regard to physical access, auditors don't necessarily have to resort to gaming the system to find physical security blunders. Sometimes it's just a matter of keeping their eyes open. For example, Laura Raderman, director of security assessments for Gemini Security Solutions, discussed a facilities assessment she recently conducted on a data center that was undergoing construction.

"The construction site had one construction person on duty letting workers in and out of the site [based on a list of approved workers], but not checking photo IDs," she says. "So, basically, almost anyone could easily walk into the construction site and then into the data center."

At the same time, the main security entrance was well-guarded with numerous security guards who were checking for two forms of identification and multiple "man-traps" to enter the data center hallways.

"There was no reason to bother with the main entrance if you could get in through the construction entrance," she said.

All of this was pretty evident with a quick look around the facilities, which goes to show that often the most likely way an auditor will flag an organization isn't through any kind of complicated test, but through simple observation.

Because of that, organizations need to remember that when an auditor is on-site, he or she is always on the lookout for practices that hint something is amiss. One big tipoff can be the way IT workers make changes in response to audit queries. "The really silly things assessors see happen when someone says, 'Oh, I'll just change that for you,'" says Walt Conway, a QSA for audit firm 403 Labs.

He recounts an episode in which his audit uncovered a firewall rule that seemed out of place. The person he was working with was eager to help the audit along, and so he offered to change the rule on the spot, a big no-no given that the company had change management policies requiring approval and a ticket. It was what Conway called 'a problem of excellence.'

The person was qualified, good, knew what he was doing and did the right thing, "sort of," Conway says. "But it wasn't the right thing from a security point of view. It wasn't documented, there was no trail, and when he disabled the rule there wasn't a comment put that this was disabled on such and such a date."

A better alternative would have been for the employee to have told Conway that he'd write down the issue and address it during a regularly scheduled firewall update or initiate an emergency change process. As Conway puts it, organizations should be following the procedures all of the time, but "at the very least when the assessor is sitting there."
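Part of what made the on-the-spot fix a problem was the missing trail: no date, no ticket, no comment on the disabled rule. Below is a minimal sketch in Python of what such a trail might look like, assuming a Linux host managed with iptables; the change-log file, ticket format, and example rule are hypothetical, not anything Conway or 403 Labs prescribes. The idea is simply that even an emergency change records the who, what, when, and why an assessor will later ask for.

```python
import json
import shlex
import subprocess
from datetime import datetime, timezone

CHANGE_LOG = "firewall_change_log.jsonl"  # hypothetical audit-trail file


def record_and_apply(rule_cmd: str, ticket: str, reason: str, dry_run: bool = True):
    """Append a change record before touching the firewall, so the date,
    ticket, and reason survive for the assessor even on an emergency change."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "ticket": ticket,        # e.g., an emergency-change ticket ID
        "reason": reason,
        "command": rule_cmd,
    }
    with open(CHANGE_LOG, "a") as log:
        log.write(json.dumps(entry) + "\n")

    if dry_run:
        print(f"[dry run] would execute: {rule_cmd}")
    else:
        subprocess.run(shlex.split(rule_cmd), check=True)


# Example: removing the out-of-place rule, but leaving a dated trail behind.
record_and_apply(
    rule_cmd="iptables -D INPUT -p tcp --dport 8080 -j ACCEPT",  # hypothetical rule
    ticket="EMG-1234",
    reason="Out-of-place rule flagged during assessment",
)
```

The tooling itself is beside the point; any ticketing system or firewall manager that stamps the change with a date, an approver, and a reason meets the same need.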

Papers, Please
During any audit, documentation is always critical to getting a compliance sign-off. So it's no surprise that many auditors say some of their most frustrating experiences revolve around poorly prepared documentation of procedures and architectures.

"I cannot tell you how difficult it is sometimes getting a network diagram," Conway says. "I can get wire-in diagrams, I can get lots of inventories, but a real solid, logical network diagram is something that is so basic that they sometimes overlook it," he says. "As an assessor, I don't care if they draw it on a white board and take a picture and send it to me. I just need something that will determine what's in and out of scope." As organizations dot their i's and cross their t's through the documentation and audit process, they should remember that seemingly small differences in nomenclature can make all the difference in compliance and the risk management activities that the auditor is there to help drive. For example, Christensen relates the experience of one audit client he had that learned the hard way the difference between fireproof and fire-resistant.

"Unfortunately, a large fire took place in the community and encapsulated an entire city block; much of the facility was destroyed," he says. "Even though all the documentation they had was in what they thought was a 'fireproof' cabinet, it had all been incinerated because it was so hot, and it was only fire-retardant."

No Assumptions Are Safe
One auditor Dark Reading spoke to would only recount his experiences anonymously, for fear things would get back to his clients. This auditor says the worst cases he sees are usually the product of invalid assumptions.

"I had one client that insisted that the scope of the audit based on regulated information was limited to a single system. When we got on-site, we found the regulated information was everywhere: email, backup systems, CRM, accounting," he says. "The effort of the audit is based on the number of systems, controls, interviews, and data samples required. The effort on this one exploded to three to five times the originally estimated effort."

Similarly, assuming that every computer in a group is being updated and patched properly just because some of them are can get an organization into trouble. Conway recounts a routine check at one client site of simple tasks such as updating antivirus, installing patches, and securely configuring machines. As part of the assessment process, he asked to do a quick spot-check of six or seven computers within a much larger group of endpoints. Out of "just dumb luck" he found that one of the machines was an exception to the general group: Because it belonged to a user who had another main machine and traveled a lot, it was never brought online with the rest of the machines during the patch rollout process. As a result, it was badly out of compliance.

"Nobody bothered [with] updated antivirus [or] put it on a list to be patched or anything else," he says. "As a result, everything else was being patched, and this machine was just invisible."

Conway says the overlooked machine is emblematic of the mistakes most organizations make. Generally, IT organizations are honest and willing to work with the auditors; they're just not as ready as they thought they were. He reiterates Christensen's warning about expecting the unexpected.

"Do people lie? No. I haven't had someone that said, 'Don't look at the man behind the curtain,'" he says. "Do they miss things because they're too close to them? Yeah. It's not that people are hiding things. In fact, it's quite the opposite. We find unexpected things."

About the Author

Ericka Chickowski, Contributing Writer

Ericka Chickowski specializes in coverage of information technology and business innovation. She has focused on information security for the better part of a decade and regularly writes about the security industry as a contributor to Dark Reading.
