The bigger data sets grow, the harder compliance becomes.
Just like "the cloud" of 2009 and 2010, this year's red-hot buzz term, bandied about by executives who may or may not have a clue what it means, is "big data." But just as 2011 saw the world wrap its head around the cloud, the time is coming when big data technology will gain traction, understanding, and deployments. And when it does, infosec professionals need to be ready for the security and compliance complications it may introduce.
So what exactly is big data? In a nutshell, it's a dataset too big to be crunched by traditional database tools. Whether it comes from scientific or environmental sensors spewing out a cascade of readings, financial systems producing a mounting cavalcade of transactions, or Web and social media apps creating a snowballing mass of records, big data is typically classed as such if it exhibits the three essential dimensions that Doug Laney, then of META Group (since absorbed into Gartner), dubbed the 3Vs of data management back in 2001: volume, variety, and velocity.
The first one's obvious: something wouldn't be called big data if there weren't a heck of a lot of it. But big data is also a swarm of unstructured data that must be fast to store, fast to recover, and, most importantly, fast to analyze.
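To make the volume and velocity dimensions concrete, here's a minimal sketch (the function name, field names, and sample records are hypothetical, not from any specific big data product) of the streaming style of analysis these datasets demand: aggregating semi-structured records one at a time, in constant memory, rather than loading the whole set into a traditional database first.

```python
import json
from collections import Counter

def stream_counts(lines, field):
    """Tally a field's values over an arbitrarily large record stream.

    Processes one record at a time, so memory use stays constant no
    matter how many records flow past -- the point of stream-style
    analysis for high-volume, high-velocity data.
    """
    counts = Counter()
    for line in lines:
        record = json.loads(line)  # semi-structured: fields may vary per record
        counts[record.get(field, "unknown")] += 1
    return counts

# Hypothetical sample standing in for millions of sensor/web records:
sample = [
    '{"source": "web", "bytes": 512}',
    '{"source": "sensor", "bytes": 48}',
    '{"source": "web", "bytes": 1024}',
]
print(stream_counts(sample, "source"))  # Counter({'web': 2, 'sensor': 1})
```

In practice the `lines` iterable would be a file handle, message queue, or socket rather than an in-memory list, which is what keeps the approach viable at scale.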
"While many analysts were talking about, many clients were lamenting, and many vendors were seizing the opportunity of these fast-growing data stores, I also realized that something else was going on," Laney wrote recently in a retrospective on that first report. "Sea changes in the speed at which data was flowing mainly due to electronic commerce, along with the increasing breadth of data sources, structures and formats due to the post Y2K-ERP application boom were as or more challenging to data management teams than was the increasing quantity of data."
When Laney first wrote about the 3Vs 11 years ago, he was mostly addressing the data management challenges that had contributed to the evolution of data warehousing. These types of data stores gain their value mainly through analysis--which is why data warehousing and business intelligence had gone hand-in-hand for years before "big data" became common parlance.