Busting DLP Myths

Misinformation has kept many organizations that could have benefited from the technology from moving forward with implementations
SAN FRANCISCO -- RSA Conference 2011 -- The hype and misconceptions around data loss prevention have held back a lot of deployments within organizations that could have used the deep content inspection DLP offers, according to a panel of experts who gathered here at the RSA Conference for a DLP myth-busting session on Wednesday.

"One of my pet peeves is a lot of people I meet say DLP is too hard, you can never do it, you've got to classify all of your data by hand before you can deploy DLP, or some garbage like that," says panelist Rich Mogull, founder of analyst firm Securosis. "That's not true; when you deploy properly you can get good results. The people I know who use DLP solutions don't have those complaints. When you get out to the people who have actually used it, none of them will tell you it’s perfect -- and, believe me, it never works as well as [the vendors] tell you it's going to work -- but they tend to give you an idea of how well it really does work."

Mogull says he puts the whiners and complainers into three categories: people who didn't buy full DLP solutions and are unhappy with the results; people who bought the tool, just hit a bunch of checkboxes, and deployed it all at once without a plan; and competing vendors that would prefer you buy their products instead of DLP.

Panelist Larry Whiteside, CSO for Visiting Nurse Service of New York and a user of DLP for the past 12 months, tends to agree that DLP’s bad rap comes from the misguided.

"It's been an interesting ride. When I first started looking at DLP suites, a lot of my peers in the industry were like, 'Oh man, you have no idea what you're getting into. There's no way anybody can implement full DLP capability. It's just not something you'll be able to do,'" Whiteside said. "But, you know, I beg to differ."

Whiteside is using DLP to keep tabs on an extremely distributed environment with more than 15,000 users. He believes that many of the misconceptions about DLP stem from confusion over what exactly DLP is -- a muddling of the term that came about as a part of the typical vendor bandwagon jumping that occurred when DLP first became a hot commodity several years ago.

This muddled category definition has been problematic for Mogull, who said that in his younger years he was more militant about defining DLP as only something with deep content inspection across all channels. Now he has come to a compromise, funneling some of the less advanced tools into a “DLP lite” category. These tools may be able to inspect only a single channel, such as e-mail, lack advanced inspection techniques, and are often plagued with more false positives.

"I'm not going to disparage it, but it just plays a different role," he said. "There are a lot of you who just want to get a sense of what is going on in the organization. You're not going to handle everything from the first day, and you can deal with false-positives. I think as long as you have some content analysis capability, that's the core part. But realistically, the less robust that is, the more difficult it is going to be for you to manage."

Whiteside explained that false positives are an especially important consideration for organizations weighing a DLP lite solution with fewer capabilities.

"That false-positive piece -- I lived with that as I went through looking at a lot of these [vendors], putting appliances in a data center, watching stuff go out, and hearing them tell me, 'Just watch: We have got so much matching, we're going to be able to do this and that,'" he said. "Then I'd get someone's phone number or something showing up as a credit card number. And I'm saying, 'Guys, that's not even close to a credit card number.'"

These vendors would come back to him and tell him it was just a matter of “tweaking” the system, but he said he felt more comfortable with a full DLP suite that didn't require so much management.
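The phone-number false positive Whiteside describes is a classic symptom of pattern matching on digit strings alone. The article doesn't say which checks his vendors used, but one standard refinement — a sketch, not any particular product's implementation — is to validate candidate matches with the Luhn checksum, which real payment card numbers satisfy and most phone numbers do not:

```python
def luhn_valid(candidate: str) -> bool:
    """Return True if the digits in `candidate` look like a plausible
    payment card number: 13-19 digits passing the Luhn checksum."""
    digits = [int(c) for c in candidate if c.isdigit()]
    if not 13 <= len(digits) <= 19:
        return False  # a 10-digit phone number is rejected on length alone
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0      # valid card numbers sum to a multiple of 10

print(luhn_valid("415-555-0123"))         # phone-like string -> False
print(luhn_valid("4111 1111 1111 1111"))  # well-known Visa test number -> True
```

A checksum alone doesn't eliminate false positives (random 16-digit strings pass it 10% of the time), which is why fuller DLP suites layer on contextual analysis rather than relying on pattern "tweaking."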

"That's what it comes down to," he said. "You have to figure out what's manageable for you and what your tolerance is to be able to touch and tweak, because it can get very overwhelming very quickly. I've got 15,000 users sending e-mail out, and as I saw these numbers ramp up very quickly, I kind of sat back, my heart began to palpitate, and I began to sweat it a little bit."

Whiteside said it is also important to think about long-term needs to prevent the necessity of a rip-and-replace of a DLP lite system a few years down the line.

"One of the things I think that is important is you need to have strategy as well," he said. "If you're trying to just solve an immediate problem, you're going to have issues because you might end up choosing the wrong product and then having to make another investment years later in a different product suite. So it’s important to have a strategy of how you want to do this moving forward."

A final consideration for smoothing DLP deployments and preventing the disasters that have given these tools a bad rap is deciding when to turn on blocking capabilities. "You don't always have to get into the mindset of, 'OK, we're going to start blocking right away,' and, in fact, in many cases that can be very disruptive to the business and not the best thing," said panelist Dave Meizlik, director of product marketing and marketing communications at Websense. "Just the actions of monitoring and issuing notifications automatically can have a dramatic impact on overall risk."

In fact, Whiteside says, a year into his deployment he's still in monitoring mode only.

"It's been 12 months and we're still only monitoring, but that monitoring has gained me so much value with the business because I've been able to walk into different business units and help them," he said. "It's helped them become more efficient in many, many ways."
