Yaron Galant

It's Not What You Know, It's What You Can Prove That Matters to Investigators

Achieving the data visibility to ensure you can provide auditors with the information they need after a breach, and do so in just a few days, has never been more difficult.

Speaking at a recent conference, Heather Adkins, Google's information security manager, posed a question to the audience that every organization should ask itself: "The question is not whether or not you're going to get hacked, but are you ready? Are you going to be able to very quickly make decisions about what to do next?"

When a breach occurs, you must do more than confirm you have implemented the appropriate information security technologies. You will also have to demonstrate you have complete visibility into the location of sensitive data, who has accessed it, and how they used and shared those files.

And you will have to do so more quickly than you might expect.

Research shows that it takes security and IT teams days, even weeks, to identify the cause of a breach and determine which files may have been exposed. The Ponemon Institute reports that it takes an average of 191 days to identify a breach, and more than two months (66 days) to contain it. That's not going to fly with lawmakers and regulatory agencies.

Consider the looming May 25, 2018, deadline to demonstrate compliance with the EU's General Data Protection Regulation (GDPR), which establishes strict requirements for protecting customer data. Two key components of GDPR create anxiety among security professionals and compliance officers: the broad definition of what constitutes personally identifiable information, and the short time frame to report a breach — within 72 hours of discovery.

In the US, the North Carolina state legislature is considering an update to the Act to Strengthen Identity Theft Protections that would require a breached entity to notify affected consumers and the attorney general's office within 15 days. You can rest assured that other states will follow suit.

To make matters worse, achieving the necessary level of data visibility to ensure you are able to provide auditors with the information on specific files that may have been exposed in a breach, and do so in just a few days, has never been more difficult.

First, there is more technology creating more data — and much of it is sensitive. Second, a typical organization has sensitive information widely distributed across its network, both on-premises and in the cloud. Your organization may have customer information stored in content systems and business applications like SAP, SharePoint, OpenText, Oracle, Box, Office 365, and many more.

These challenges pertain only to the content that stays inside your organization. In today's environment, it has become virtually impossible to get business done without sharing confidential information with partners and other third-party service providers.

Consider a loan application received by a bank that must be routed to multiple third parties to conduct background and credit checks as well as a property appraisal, and secure the tax and lien history. Or a physician who forwards a patient's medical records to a colleague for a second opinion and then sends the agreed-upon treatment plan to the patient's insurance company. These kinds of use cases occur all the time.

Managing and monitoring who is accessing sensitive information and what they're doing with it is seldom done efficiently — if done at all — and yet these are critical to data security and regulatory compliance. 

While IT has ceded the control it once had over enterprise content, it remains responsible for protecting sensitive information from both outside hackers and the rising insider threat — either the malicious actor who steals information or the innocent employee whose mistake accidentally exposes sensitive data.

The trouble is, security has become anathema to efforts to improve business agility and employee productivity levels. End users are increasingly interrupted with notifications and requests from security solutions to install updates and perform system sweeps, disrupting workflows and hurting employee productivity.

This constant tug of war between IT's efforts to secure data and users' needs to share data has given rise to Shadow IT. Users embrace consumer (read: unsecure) solutions without IT's permission (or knowledge) to share files by downloading them to USB thumb drives, or uploading them to a cloud-based service like Dropbox. Users are drawn to these solutions for their functionality and ease of use; however, IT loses critical visibility over the movement and usage of files. The problem boils down to this: you can't secure what you can't see.

IT must have the controls necessary to demonstrate compliance with internal policies and industry standards. There are three key steps you can take to ensure you are able to supply information on file activity in an auditable format to internal auditors, government regulators, and/or legal teams when a security incident happens:

  1. Monitor all sensitive data: You must gain an understanding of how your employees and partners are accessing, using, and sharing your organization's sensitive data.
  2. Create detailed audit logs that show all content activity: Knowing exactly who accessed your content, when they accessed it, and how they used it will enable you to demonstrate your organization's compliance with rigorous industry regulations.
  3. Keep content where it belongs: Instead of creating duplicate copies of content for the purposes of collaboration, integrate with the applications that create the content to manage the associated workflows. 
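To make the second step concrete, the detailed audit log described above can be as simple as an append-only stream of structured records, one per file event. The sketch below is a hypothetical illustration, not any vendor's actual API; the field names and `log_event` helper are assumptions chosen to capture the who/what/when/how that auditors ask for.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Optional

@dataclass
class FileAuditEvent:
    """One auditable record of activity on a sensitive file (hypothetical schema)."""
    user: str                          # who accessed the content
    file_path: str                     # where the content lives (no duplicate copies)
    action: str                        # how it was used: "read", "edit", "share_external", ...
    timestamp: str                     # when, as ISO 8601 UTC
    shared_with: Optional[str] = None  # external recipient, if the action was a share

def log_event(user: str, file_path: str, action: str,
              shared_with: Optional[str] = None) -> str:
    """Serialize an event as one JSON line, ready for an append-only audit log."""
    event = FileAuditEvent(
        user=user,
        file_path=file_path,
        action=action,
        timestamp=datetime.now(timezone.utc).isoformat(),
        shared_with=shared_with,
    )
    return json.dumps(asdict(event))

# Example: record an external share of a loan application with an appraiser
line = log_event("jsmith", "/loans/app-1042.pdf", "share_external",
                 shared_with="appraiser@example.com")
```

One JSON line per event keeps the log machine-parseable for investigators while remaining human-readable; in practice the same records would feed monitoring (step 1) as well as the after-the-fact audit trail (step 2).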

Simply showing investigators your security system will not relieve you of responsibility for a data breach and the resulting financial and legal consequences. To quickly and effectively identify the root cause of a security incident, mitigate the damage, and demonstrate full compliance, you and your IT department must know the details of the location, access controls, and activity around every single file, including how it may be shared externally.

Yaron Galant joined Accellion in 2017 and brings 25 years of experience in product strategy, management, and development. A pioneer in security and analytics, Mr. Galant has played a leading role in the creation of the Web application security space and ...