Or the importance of timeliness in monitoring

Wendy Nather, Research Director, Enterprise Security Practice

January 14, 2013


Does your data need to be poppin' fresh, organic, and locally sourced? Maybe not; it depends on how and why you’re consuming it.

Organizations that have figured this out tend to have tiers of data. There’s the kind of live data that must be consumed immediately and refreshed as soon as a change comes along. (Think of this as soufflé data if you like. You have to rush it out of the oven before it falls. Or the lettuce in your refrigerator that really isn’t worth it once it wilts; you’re better off getting a new head.) Live data drives immediate responses: trading transactions, stock prices, credit card processing, industrial control data, vital signs during a medical emergency, or altitude and speed data as a plane is landing. Live data will be kept as close to the consumption point as possible and will receive most of the storage, delivery, and access resources so that it can be updated as fast as it changes.

Then there’s cruising speed data -- which you might update on a regular basis, but its timeliness isn’t as vital. For example, you could check once a day to see whether yesterday’s terminated employees had their access revoked by the evening. It’s still important, but not so much that you need up-to-the-minute reports vying for your attention. This data could be kept where it is generated and only presented as scheduled. To extend the grocery analogy, this would be the bottles of milk delivered to your door (does anyone else remember that, by the way?).
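That kind of scheduled check doesn't need much machinery. As a minimal sketch (the HR feed and directory export here are hypothetical stand-ins for whatever sources you actually have), a daily job can just intersect yesterday's terminations with the accounts still active:

```python
# Minimal sketch of a once-a-day access-revocation check.
# terminated_yesterday and active_accounts are hypothetical
# stand-ins for an HR termination feed and a directory export.

def unrevoked_accounts(terminated_yesterday, active_accounts):
    """Return accounts that should have been disabled but weren't."""
    return sorted(set(terminated_yesterday) & set(active_accounts))

terminated_yesterday = ["jdoe", "asmith"]
active_accounts = ["jdoe", "bjones", "clee"]

# Any overlap is an exception to flag in the next scheduled report.
print(unrevoked_accounts(terminated_yesterday, active_accounts))
```

Run once a day from a scheduler and the result lands in your inbox on cruising-speed cadence, rather than competing with live alerts.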

Reference data, or rarely used data, can be stored near-line or offline. These are those spices in your kitchen that you pick up every couple of years, decide that they’ve expired, and go out to get new ones. You can’t get rid of one entirely because you never know when you’re going to need nutmeg. Historical security data needs to be available for audits, or for, "Hey, haven’t we seen this before?" situations, but it should be delivered on demand and stay out of the way when it’s not needed.

Despite the attractiveness of "Big Data," don’t fall into the trap of thinking this means you can put all these types of data together in one big, honking Hadoop. It’s like getting a gigantic freezer and thinking you can now fill it up with those huge packs of chicken wings that you only eat during football season. Cloud storage has made it easier to keep reference data near-line. (I don’t consider storage to be truly online if it’s free to upload but you have to pay to restore.)

The important thing about security monitoring data is that its timeliness and velocity depend on how quickly you can do something with it. If you have enough resources to be able to take immediate action on an alert, or if you have automation in place that can change configurations on the fly (say, generate new IPS rules), then shrinking that "real time" window of data delivery makes sense -- and there are plenty of vendors out there that claim faster and faster speeds for that. But if you can only find time to review logs once a month, then syslog and grep are probably as much as you need; don’t spend money on a fancy SIEM if you can’t drive it more often than just on Sundays.
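In that monthly-review spirit, the "syslog and grep" approach amounts to filtering your logs for a handful of patterns worth a second look. A minimal sketch (the sample lines and keywords are purely illustrative, not a real detection ruleset):

```python
# Minimal sketch of "syslog and grep" style review: scan log
# lines for watched patterns. The keywords and sample lines are
# illustrative placeholders, not recommended detection rules.

KEYWORDS = ("authentication failure", "FAILED LOGIN", "session opened for user root")

def interesting(lines):
    """Return log lines containing any watched keyword."""
    return [line for line in lines if any(k in line for k in KEYWORDS)]

sample_log = [
    "Jan 14 03:02:11 host sshd[812]: authentication failure; user=admin",
    "Jan 14 03:05:40 host cron[901]: session opened for user backup",
]

print(interesting(sample_log))
```

If that's the pace you can actually sustain, it costs you nothing but a cron entry; the fancy SIEM can wait until you have someone to watch it.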

And if you’re an organization that can’t afford log storage at all -- I know you’re out there -- and you have the equivalent of an empty fridge with a jar of mustard and two beers, and you go out for meals all the time, then think about the nutritional value of your data and what it’s costing you to have someone else dig it up for you in emergencies. This is probably why you’re not discovering that you’ve been breached until law enforcement tells you about it six months later.

Now that we have a fresh start with the new year, and you’re reorganizing your pantry and freezer anyway, you might as well review your security data storage. Look at your response requirements and capabilities and then decide what needs to be pushed to the front. While you’re at it, you might take some Windex to that "single pane of glass," if you have one for dashboards and such. Your mother -- I mean, auditor -- will be proud.

Wendy Nather is Research Director of the Enterprise Security Practice at the independent analyst firm 451 Research. You can find her on Twitter as @451wendy.

About the Author(s)

Wendy Nather

Research Director, Enterprise Security Practice

Wendy Nather is Research Director of the Enterprise Security Practice at independent analyst firm 451 Research. With over 30 years of IT experience, she has worked both in financial services and in the public sector, both in the US and in Europe. Wendy's coverage areas include IAM, application security, threat intelligence, security services, and risk management. She is a frequent speaker at various industry conferences in the US and abroad, and co-authored The Cloud Security Rules.

