Experts share tips on how to avoid the most common pitfalls in an audit

Nobody passes a security audit on the first try.

You might have your access control process fixed, but you probably haven't adequately trained your administrators on how to manage it. You might have your configuration and change control systems in place, but you probably haven't sufficiently documented the process for using them. If you've adopted strict security policies, your users likely have found a way of avoiding or bypassing them altogether.

Make no mistake -- auditors will find fault with your systems, your processes, and the people who operate them. They're auditors. It's their job.

If you only knew the most common reasons for audit failure in advance, so that you could double-check your environment and fix those potential deal-busters before the auditor comes in. If you only had some tips from experts who have "been there" on how to shore up your environment to beat an audit.

Hey, wait a minute, that's what's in this article!

The following are eight tips offered by auditors, consultants, and others who have been through the IT security audit mill: what auditors look for in a compliance audit, and how to fix those problems before an auditor fails you on them. It's not a comprehensive "cheat sheet," but it should give you some ideas on why companies fail their audits, and what you can do to avoid the same pitfalls.

If you have any ideas or tips that we've overlooked here, please post them to the message board attached to this article. We'd love to hear about your experiences with compliance audits -- and what you'd do differently if you had them to do all over again.

Contents:

  • Establish a consistent set of practices for change management
  • Keep your app developers away from production/operations
  • Give users access only to the data and apps they need
  • Shore up physical access to your systems
  • Establish methods to detect security anomalies -- and where they come from
  • Map your security processes to real business processes
  • Double (and triple) check your accounting processes
  • Document your work and train your users on what you've done

    Establish a consistent set of practices for change management

    There's no such thing as a static IT environment. If you're not properly and consistently keeping track of changes in your organization, you've got a big fat problem. And the lack of a formal change management process could earn you a big fat "F" on your audit report.

    Security audit experts say you need formal documentation and change procedures, as well as oversight of changes -- Joe in accounting is now working from home instead of the office, for instance -- and regular reviews of your change logs. And you'd better know about that user who was recently fired, so you can immediately disable his account in case he has revenge in mind.

    "Three years ago, companies had really poor change management, but today, their change management process [is] moving toward automation," says Paul Proctor, a research vice president at Gartner. "But there's still a hefty number of them that don't have any change management" at all, he says.

    Auditors are typically tough on change management. IBM recommends documenting change-management policies and procedures and updating them regularly; reviewing, analyzing, and approving change requests; and testing changes before you make them, according to Robin Hogan, program manager for IBM governance and risk management.

    But monitoring change isn't as easy as it sounds: "I've seen tables full of minutes from change-board meetings, forms completed appropriately -- but no evidence that the actual change itself was appropriately implemented, or even implemented at all," Hogan says. The key is an automated change management system that tracks what changes were made and by whom, then matches them to specific systems, she says.

    Proctor says change management is more of a process control issue than a technological one. Gartner recommends "change reconciliation," where you use tools like Tripwire and database monitoring to automatically detect any changes to data or files -- and then cross-check them with authorized changes.

    "If you then go back to the CMDB [change management database] and reconcile things you detected with authorized change requests," that's change reconciliation, Proctor says. "This is to address auditor concerns to prove that nothing happened that shouldn't have" to the data.

    But organizations have a ways to go on the reconciliation side -- Proctor doesn't expect it to become a regular part of the change management process for another four or five years. "The problem is you have to have tightly controlled change management, and every time you detect it, you have to go back and reconcile it."
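    To make the reconciliation idea concrete, here's a minimal sketch in Python. It assumes a hypothetical export of detected file changes (the kind of data a Tripwire-style integrity monitor produces) and a list of approved change requests pulled from a CMDB; the record fields and values are illustrative, not any particular product's format.

```python
from datetime import datetime

# Illustrative records -- real integrity monitors and CMDBs each have
# their own schemas and export mechanisms.
detected_changes = [
    {"host": "db01", "path": "/etc/my.cnf", "when": datetime(2007, 6, 1, 2, 15), "by": "jsmith"},
    {"host": "web02", "path": "/opt/app/config.xml", "when": datetime(2007, 6, 1, 9, 40), "by": "deploy"},
]

approved_requests = [
    # Each approved request names the asset, the file, the change window, and the assignee.
    {"host": "web02", "path": "/opt/app/config.xml",
     "start": datetime(2007, 6, 1, 9, 0), "end": datetime(2007, 6, 1, 10, 0), "assignee": "deploy"},
]

def reconcile(detected, approved):
    """Return detected changes that have no matching, authorized change request."""
    unauthorized = []
    for change in detected:
        ok = any(
            req["host"] == change["host"]
            and req["path"] == change["path"]
            and req["start"] <= change["when"] <= req["end"]
            and req["assignee"] == change["by"]
            for req in approved
        )
        if not ok:
            unauthorized.append(change)
    return unauthorized

for c in reconcile(detected_changes, approved_requests):
    print(f"UNAUTHORIZED: {c['by']} changed {c['path']} on {c['host']} at {c['when']}")
```

    Anything the function returns is exactly what Proctor describes: a change you detected that nobody authorized.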

    Keep your app developers away from production/operations

    With many large organizations outsourcing their IT operations and software development, a clean separation between your application developers and your operational, production systems is more crucial than ever.

    "Application developers should not have access to the production environment," says Kris Lovejoy, IBM's director of strategy for governance and risk management.

    By testing code in the operational environment, developers can either slow or disrupt business operations. With so many companies using third parties to develop their custom internal apps, the production environment can be extremely vulnerable, Lovejoy says. If operations and development aren't adequately segregated, auditors will be crawling all over you, she says.

    And beware of leaving IT with indiscriminate access to systems and databases: "That includes giving a developer or programmer or database administrator access to a system that is completely unmonitored and uncontrolled to a production environment or production data," Lovejoy says. "This is why we see the market turning to identity and access management systems."

    Proctor says this lack of a separation of duties is a major problem in organizations today: Anywhere from 60 to 70 percent of Gartner's clients give their developers access to production code, and about 25 percent of its clients provide all of their administrators access to everything. "This is more the case for smaller businesses, and they are getting hammered in their audits there. Auditors are stepping in and saying 'you need to fix this.' "

    "The main reason [this problem] exists is because it got baked into the way companies do business. Their developers are the ones who put the operations code in place, and everybody has access to everything." This is a prime example of where the least-privilege approach to authorization would come in handy, he says.

    It's all about separation of duties, says David Smith, senior regulatory compliance analyst for Symantec. A better approach would be to require a manager to approve what each IT person or individual user can do. "[Sarbanes-Oxley] auditors focus heavily on separation of duties."

    "Any person who just does one thing on a system... can execute on that one system." But organizations may have to tweak the way their IT resources operate today, he says.

    Give users access only to the data and apps they need

    Access control is even more crucial when it comes to end users, who definitely don't need access to everything. Getting a handle on this aspect of compliance requires knowing who (and where) your users are, and what privileges they have.

    IBM's Lovejoy says a lack of user access control is one of the top reasons companies fail an audit. "This is the inability to provision users effectively and administer their accounts... And take into account any changes in responsibility or to identify and revoke privileges when a user is terminated."

    User access control is closely related to change management. One of the first steps in good change management is keeping people out of places where they shouldn't be, Gartner's Proctor says. "Everybody recognizes that it's well-intentioned people who cause most of the downtime and [security] problems... Any type of change in the system is a time a flaw can be introduced."

    Identity and access management are the goal, Proctor says. "Most audit findings today say they either want you to control who has access to what, or be able to report on who has access to what," he says. A first-line manager should sign off on whether a particular user should have access to a particular system, he says.

    Manual and homegrown provisioning just doesn't cut it anymore. "Now that you've got all these identity audit requirements, you need to be going with a tool for it," Proctor says. That means deploying a tool that automates -- and tracks -- the process for you, he says.
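    Here's a small, hypothetical example of the kind of check an automated provisioning tool performs: compare the HR roster against active accounts and flag anything belonging to someone who has left. The data sources and names are invented for illustration.

```python
# Hypothetical inputs: an HR feed of current employees and the active accounts
# per user. In practice these come from the HR system and each directory or app.
current_employees = {"alice", "bob", "carol"}
active_accounts = {
    "alice": ["vpn", "erp"],
    "bob": ["vpn"],
    "dave": ["erp", "crm"],   # dave was terminated but still has accounts
}

def find_orphaned_accounts(employees, accounts):
    """Flag accounts whose owner no longer appears in the HR roster."""
    return {user: systems for user, systems in accounts.items() if user not in employees}

for user, systems in find_orphaned_accounts(current_employees, active_accounts).items():
    print(f"Disable immediately: {user} still has access to {', '.join(systems)}")
```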

    User access control shouldn't just be a gatekeeper function, either, Symantec's Smith says. If you think your passwords don't need to be as strong internally, think again: "It's easy [for an attacker] to plug in a laptop in the lobby of the building."

    Shore up physical access to your systems

    What, no biometric scanner on your data center door?

    Physical access control to sensitive systems and equipment seems like a no-brainer, especially when you're preparing for an audit. But how much control do you need -- and how do you manage it?

    Whether you need locks, sign-in sheets, fingerprint scans, or smart cards depends on the number of people who need access, as well as the level of sensitivity of the data. Gartner recommends deploying the minimum physical controls: "If only two people have access to a room, a lock and key work just fine -- just give the two of them the key," Gartner's Proctor says. "You don't need smart cards then."

    If there are dozens of IT people who need access to a system, then it's time to look at a sign-in/out log, or smart cards, which also track dates and times of access.

    Gartner suggests publishing your physical-access control policies, including who's allowed where. Enforcement is also important -- with pre-defined consequences -- as well as educating employees on these policies.

    Some data-sensitive areas may require multi-factor authentication (think retinal scans), proximity cards, and video surveillance. This tightly controlled physical security should be driven by your business requirements, not by fears of an audit, according to Gartner.

    The physical security problem is often exacerbated because IT security people and physical security people don't communicate, and may even work at cross-purposes. "There's a lot less synergy there than you'd expect," Proctor says.

    Establish methods to detect security anomalies -- and where they come from

    If you can't monitor it, you can't manage it, the old adage goes, and this is certainly true when it comes to security compliance. One of the first things that auditors will ask your company is how it knows when someone -- either inside or outside the company -- is tampering with sensitive data. You need to be able to not only answer this question, but demonstrate it onscreen.

    Until recently, most companies did their monitoring through some combination of real-time systems -- such as intrusion prevention systems (IPS) or security information management (SIM) tools -- and retrospective analysis of log files to show who accessed which files, and when. SIM tools, in particular, have become a popular method of showing auditors security-related events in the enterprise, and what steps have been taken to prevent unauthorized access.

    "Most of the time, organizations already have controls and policies in place," says Indy Chakrabarti, group product manager at Symantec, which makes SIM tools. "What they need is a way to lower the cost of compliance and enforcement, and that's what our tools are designed to do." (See A Multitude of SIMs.)

    But a new class of vendors and products is also emerging for "compliance management," an idea which is sometimes expanded to include IT governance, risk, and compliance (GRC) management. These products are designed, in part, to monitor all the pieces of policy management and compliance, and warn enterprises when they are about to fall out of line with regulations or policies.

    "We look at this as an opportunity to translate business requirements into IT activities and metrics that can be measured," says IBM's Lovejoy. "Security and compliance are an important part of that, but so are business resilience and service management." Other vendors prefer to focus primarily on the compliance piece. "In most cases, the CXO is not interested in looking only at security events," says Dean Coza, director of product marketing at ArcSight. "They want to track new compliance problems, and do some baselining on how the organization is performing against policies and controls. A roles-based approach helps the company monitor not just how its systems are doing, but how its people are doing."

    Whatever you decide about monitoring tools, you need to be sure that they can demonstrate to the auditor that your organization can track who is accessing sensitive data -- and can alert the troops when unauthorized access is taking place. If you have those systems in place and tested before the auditor arrives, you'll have a leg up when the audit begins.
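    For illustration only, the following sketch shows the simplest version of that evidence: scan access records for sensitive resources touched by anyone outside the authorized list. Real SIM and log-management tools do this at far greater scale with normalized event data; the record format here is made up.

```python
# Illustrative access-log records; a real SIM normalizes events from many sources.
access_log = [
    {"user": "alice", "resource": "payroll.db", "action": "read"},
    {"user": "mallory", "resource": "payroll.db", "action": "read"},
    {"user": "bob", "resource": "wiki", "action": "read"},
]

# Who is authorized to touch each sensitive resource.
AUTHORIZED = {"payroll.db": {"alice", "hr_admin"}}

def flag_unauthorized(log, authorized):
    """Return events where a sensitive resource was touched by an unauthorized user."""
    return [e for e in log
            if e["resource"] in authorized and e["user"] not in authorized[e["resource"]]]

for event in flag_unauthorized(access_log, AUTHORIZED):
    print(f"ALERT: {event['user']} performed {event['action']} on {event['resource']}")
```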

    Map your security processes to real business processes

    When you're preparing for an audit, it's important to remember that the auditor's job is to find out whether your organization has sprung any security leaks -- and the audit process may differ from organization to organization. A compliance audit is not like a home inspection, where the inspector typically works from a checklist that you can review in advance. A flaw that might be overlooked in one business might cause another business to fail its audit.

    As a result, IT and security organizations should resist the temptation to measure their compliance efforts against a pre-written "checklist" of compliance issues that can be crossed off like a grocery list. Many companies that take this approach are disappointed to learn that they've failed -- because while they have met the "letter" of compliance, they haven't considered its "spirit" -- the prevention of leaks that might hurt employees, customers, or investors.

    Organizations often "try come to up with [an audit] checklist, versus looking at their business process," says Symantec's Smith. "Organizations need to be able to produce evidence that's useful to the auditor. If you do risk-based control management effectively, you can reduce the audit cycle, understand the questions they are going to ask, and be prepared."

    Auditors typically ask a lot of questions about the business -- how it operates, who has access to information, and which data is the most sensitive, experts say. In regulatory environments where IT compliance requirements are vague, such as SOX or GLBA, the auditor's evaluation of your organization's compliance will depend on your ability to prove that you are protecting your most sensitive data during the course of day-to-day business -- not on a cookie-cutter list of compliance requirements.

    If your security policy is effective and fits with the ebb and flow of information inside and outside your organization, you've got a good chance at passing your audit, experts say. But if you focus your efforts on the auditor's requirements, rather than business requirements, you may paradoxically find yourself on the wrong end of the auditor's pen.

    Double (and triple) check your accounting processes

    One of the myths about SOX compliance is that it's all about proving the security of the organization's IT systems, experts say. But in fact, it's all about ensuring that a public company's financial data isn't tampered with -- from inside or out.

    "What we've seen recently is that nearly half of the compliance deficiencies that companies encounter are on the accounting side, while less than 5 percent are IT systems related," said John Pescatore, vice president and distinguished analyst at Gartner, at the company's security summit in Washington, D.C. last month. If your organization fails its SOX audit, it's more likely to be a flaw in the way accounting is handled than anything to do with IT, he said. (See Security's Sea Change.)

    Just a few weeks ago, the Public Company Accounting Oversight Board (PCAOB) -- a private, nonprofit entity that gives guidance to the many auditors who evaluate SOX compliance -- changed its guidelines to reflect more real-world threats around company financials, and softened some of the rules surrounding less-likely methods for tampering with financial data. (See New Rules May Ease SOX Audits.)

    "[The PCAOB is] saying, 'let's stop and think about this,' " says Patrick Taylor, CEO of Oversight, which makes software for analyzing the accuracy and security of financial transactions. "Most financial fraud is going to occur in a rush, right at the end of a reporting period, when the company finds out that it's going to have some problems with its numbers," he says. "Those are going to be changes that somebody makes to the general ledger, which are relatively easy to detect.

    "Contrast that with, say, backup," Taylor explains. "To commit financial fraud through a backup system, you'd have to gain access to the backup data, and then you'd have to have the knowledge to alter it. Then you'd somehow have to crash the operational systems so that the backup data would be put in place. That's a lot more complex, and a lot less likely, than making simple changes in the general ledger. And the audit process should reflect that."

    Under the revised PCAOB guidelines, auditors will have the freedom to focus their attention on the transaction paths that could most likely lead to fraud, instead of auditing every possible transaction path to financial data. That means that most SOX audits will be much more heavily weighted toward accounting systems and practices, and scrutiny of the enterprise-wide IT security platform will likely be reduced, Taylor suggests.

    The new rules might lighten the burden on IT, but they won't necessarily lessen the subjective nature of audits for regulations such as SOX and HIPAA, which leave a great deal of room for interpretation, says Chris Davis, manager of compliance knowledge management at Cybertrust.

    "We'll get a lot more specificity on the business requirements, but not on the IT requirements," Davis suggests.

    Document your work and train your users on what you've done

    Compliance audits are like swinging a golf club, experts say: If you fail to follow through, you'll end up in the weeds.

    Many auditors agree that two of their most common reasons for failing a company's compliance efforts are poor documentation and poor training programs. The best security policies and practices can still fail an audit if there is no clear system for implementing and enforcing them, they say.

    "I've failed companies that passed 99 percent of the requirements but didn't do their training or documentation correctly," said Nigel Tranter, a partner at Payment Software Co., a leading Payment Card Industry Data Security Standard (PCI DSS) auditing firm, in an interview last year. (See Retailers Lag on Security Standard.)

    Most auditors start their evaluations by reading the documentation of an organization's security efforts, experts say. Poor documentation -- or no documentation on some aspect of the compliance initiative -- is like holding a red cape in front of a bull, even if the technology and practices are working well.

    Similarly, if the effort to train administrators and users on compliance is perceived to be weak, the audit worm can turn, according to those familiar with the process.

    The key to good documentation and training is to monitor and review them constantly, and to keep them updated as compliance-related changes are made in systems and practices, experts say. In a study conducted last year by the IT Policy Compliance Group, the companies rated "best in class" generally checked themselves for compliance every 21 days or less; many of the laggards did a self-audit only once or twice a year.

    "What that says is that to be successful in compliance, you've got to find a way to do some automated monitoring," said Jim Hurley, managing director of the IT Policy Compliance Group and a research director at Symantec. "You can't do it all with people."

    — The Staff, Dark Reading

About the Author(s)

Kelly Jackson Higgins, Editor-in-Chief, Dark Reading

Kelly Jackson Higgins is the Editor-in-Chief of Dark Reading. She is an award-winning veteran technology and business journalist with more than two decades of experience in reporting and editing for various publications, including Network Computing, Secure Enterprise Magazine, Virginia Business magazine, and other major media properties. Jackson Higgins was recently selected as one of the Top 10 Cybersecurity Journalists in the US, and named as one of Folio's 2019 Top Women in Media. She began her career as a sports writer in the Washington, DC metropolitan area, and earned her BA at William & Mary. Follow her on Twitter @kjhiggins.
