Perimeter
Commentary
Wendy Nather
1/14/2013 10:20 AM

All Your Base Are In An Indeterminate State

Or the importance of timeliness in monitoring

Does your data need to be poppin' fresh, organic, and locally sourced? Maybe not; it depends on how and why you’re consuming it.

Organizations that have figured this out tend to have tiers of data. There’s the kind of live data that must be consumed immediately and refreshed as soon as a change comes along. (Think of this as soufflé data if you like. You have to rush it out of the oven before it falls. Or the lettuce in your refrigerator that really isn’t worth it once it wilts; you’re better off getting a new head.) Live data drives immediate responses: trading transactions, stock prices, credit card processing, industrial control data, vital signs during a medical emergency, or altitude and speed data as a plane is landing. Live data will be kept as close to the consumption point as possible and will receive most of the storage, delivery, and access resources so that it can be updated as fast as it changes.

Then there’s cruising speed data -- which you might update on a regular basis, but its timeliness isn’t as vital. For example, you could check once a day to see whether yesterday’s terminated employees had their access revoked by the evening. It’s still important, but not so much that you need up-to-the-minute reports vying for your attention. This data could be kept where it is generated and only presented as scheduled. To extend the grocery analogy, this would be the bottles of milk delivered to your door (does anyone else remember that, by the way?).
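
For what it's worth, that daily check doesn't need a product behind it. Here's a minimal sketch in Python, assuming two hypothetical exports: an HR feed of yesterday's terminations and a directory dump of accounts that are still enabled. The file names and column names are illustrative, not any particular system's format.

# Minimal sketch of a daily "did we actually revoke it?" check.
# Assumes two hypothetical CSV exports: yesterday's terminations from HR
# and the accounts still enabled in the directory. Paths and column names
# are illustrative only.
import csv
from datetime import date, timedelta

def load_column(path, column):
    """Return a set of values from one column of a CSV export."""
    with open(path, newline="") as f:
        return {row[column].strip().lower() for row in csv.DictReader(f)}

yesterday = date.today() - timedelta(days=1)
terminated = load_column(f"hr_terminations_{yesterday}.csv", "username")
still_enabled = load_column("directory_enabled_accounts.csv", "username")

# Anyone in both sets is a termination whose access didn't get revoked.
for user in sorted(terminated & still_enabled):
    print(f"ACCESS NOT REVOKED: {user}")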

Reference data, or rarely used data, can be stored near-line or offline. These are those spices in your kitchen that you pick up every couple of years, decide that they’ve expired, and go out to get new ones. You can’t get rid of one entirely because you never know when you’re going to need nutmeg. Historical security data needs to be available for audits, or for, "Hey, haven’t we seen this before?" situations, but it should be delivered on demand and stay out of the way when it’s not needed.
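
If it helps to see those three tiers side by side, here's a rough sketch of how the policy might be written down; the tier names, examples, and delivery modes are just my shorthand for the scheme above, not a prescription.

# Illustrative tiering policy for security monitoring data, following the
# live / cruising-speed / reference split described above. The examples and
# delivery choices are assumptions for the sketch, not recommendations.
DATA_TIERS = {
    "live": {          # souffle data: act on it now or not at all
        "examples": ["IDS alerts", "fraud signals on transactions"],
        "delivery": "streamed to responders or automation in near real time",
        "storage": "closest to the consumption point, most resources",
    },
    "cruising": {      # milk-bottle data: refreshed on a schedule
        "examples": ["daily access-revocation checks", "patch status"],
        "delivery": "scheduled reports, e.g. once a day",
        "storage": "kept where it is generated",
    },
    "reference": {     # nutmeg data: rarely touched, but keep it around
        "examples": ["historical logs for audits and 'haven't we seen this?'"],
        "delivery": "on demand only",
        "storage": "near-line or offline, e.g. cheap archive storage",
    },
}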

Despite the attractiveness of "Big Data," don’t fall into the trap of thinking this means you can put all these types of data together in one big, honking Hadoop. It’s like getting a gigantic freezer and thinking you can now fill it up with those huge packs of chicken wings that you only eat during football season. Cloud storage has made it easier to keep reference data near-line. (I don’t consider storage to be completely online if it’s free to upload but you have to pay to restore.)

The important thing about security monitoring data is that its timeliness and velocity depend on how quickly you can do something with it. If you have enough resources to be able to take immediate action on an alert, or if you have automation in place that can change configurations on the fly (say, generate new IPS rules), then shrinking that "real time" window of data delivery makes sense -- and there are plenty of vendors out there that claim faster and faster speeds for that. But if you can only find time to review logs once a month, then syslog and grep are probably as much as you need; don’t spend money on a fancy SIEM if you can’t drive it more often than just on Sundays.
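
And if once a month really is your cadence, the "syslog and grep" level of tooling can be as small as the sketch below. The log path and the patterns are assumptions; swap in whatever your systems actually emit.

# Minimal monthly log review, roughly the "syslog and grep" tier.
# The log path and the patterns of interest are assumptions for this sketch.
import re
from collections import Counter

LOG_PATH = "/var/log/syslog"   # or last month's archived copy
PATTERNS = {
    "failed_auth": re.compile(r"Failed password|authentication failure"),
    "sudo_use": re.compile(r"sudo: .* COMMAND="),
    "oom_kill": re.compile(r"Out of memory|oom-killer"),
}

counts = Counter()
with open(LOG_PATH, errors="replace") as log:
    for line in log:
        for name, pattern in PATTERNS.items():
            if pattern.search(line):
                counts[name] += 1

for name, count in counts.most_common():
    print(f"{name}: {count} matching lines")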

And if you’re an organization that can’t afford log storage at all -- I know you’re out there -- and you have the equivalent of an empty fridge with a jar of mustard and two beers, and you go out for meals all the time, then think about the nutritional value of your data and what it’s costing you to have someone else dig it up for you in emergencies. This is probably why you’re not discovering that you’ve been breached until law enforcement tells you about it six months later.

Now that we have a fresh start with the new year, and you’re reorganizing your pantry and freezer anyway, you might as well review your security data storage. Look at your response requirements and capabilities and then decide what needs to be pushed to the front. While you’re at it, you might take some Windex to that "single pane of glass," if you have one for dashboards and such. Your mother -- I mean, auditor -- will be proud.

Wendy Nather is Research Director of the Enterprise Security Practice at the independent analyst firm 451 Research. You can find her on Twitter as @451wendy.

