Perimeter
Commentary | Adrian Lane | 2/22/2012 11:19 AM

Can You Delete A Database?

Data and databases keep growing, but there's a security tradeoff

When was the last time you deleted a database -- not accidentally, but on purpose? Have you ever willfully deleted a database? How about removed sensitive data from one?

Most database administrators I've spoken with have never retired the contents of a database. They may migrate the contents of an old database into a newly architected repository, but they seldom just delete a database, purge old data that is clearly obsolete, or truncate tables of sensitive data. DBAs are trained to keep data consistent and to make sure it can be recovered in an emergency. It's their job, and there is a legitimate fear of being fired if you can't produce data when it's requested.

But from a security perspective, removing old data is a simple precaution. Why do I recommend this approach? First, you can't steal what's not there: if you delete data from the database and keep only an encrypted tape backup, you're better off if your systems are breached. Second, it's an inexpensive option that requires no special products and no additional purchases. And as an added bonus, shrinking a database means smaller storage requirements and less overhead on queries, both of which improve performance.
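To make the idea concrete, here is a minimal sketch of what a retention-based purge could look like, written in Python against SQLite. This is not the author's procedure; the orders table, created_at column, file names, and 18-month window are all hypothetical placeholders for whatever your own schema and retention policy dictate.

```python
# Minimal sketch of a retention-based purge (hypothetical schema and paths).
# Rows older than the retention cutoff are exported first -- so they can be
# encrypted and stored offline -- and only then deleted from the live database.
import csv
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 18 * 30              # roughly 18 months; adjust per policy
DB_PATH = "app.db"                    # hypothetical database file
ARCHIVE_PATH = "orders_archive.csv"   # encrypt and move this offline afterward

cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()

conn = sqlite3.connect(DB_PATH)
try:
    cur = conn.execute(
        "SELECT * FROM orders WHERE created_at < ?", (cutoff,)
    )
    rows = cur.fetchall()

    # Export the expiring rows before touching the live data.
    with open(ARCHIVE_PATH, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # column headers
        writer.writerows(rows)

    # Only now remove the data from the live database.
    conn.execute("DELETE FROM orders WHERE created_at < ?", (cutoff,))
    conn.commit()
    print(f"Archived and deleted {len(rows)} rows older than {cutoff}")
finally:
    conn.close()
```

The point of the export-then-delete order is exactly the tradeoff described above: the data leaves the production database, where it can be stolen, but remains recoverable from an encrypted archive if someone asks for it later.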

The real problem is that this scares the heck out of database administrators. What happens if someone actually wants that data a year from now? Could you recover it? Do you even know who owns it, so you can ask whether you can delete it? What if it's subject to regulatory controls you're not aware of? No, it's easier just to keep the data.

And in this day and age, IT keeps more databases and collects every tidbit of data it can, so databases keep growing. We collect more data and look for new ways to derive information from it. More data means more information, resulting in better decisions that hopefully provide some competitive sales advantage. Conceptually, anyway. Some firms are under strict regulatory controls to keep data for five, seven, or even ten years. But studies show that as much as 30 percent of data used for analytics goes "bad" after just 18 months. For your reports, that means garbage in, garbage out.

But unlike garbage, bad data does not smell, so DBAs have no good incentive to get rid of it. Until you're breached, that is.

Adrian Lane is an analyst/CTO with Securosis LLC, an independent security consulting practice. Special to Dark Reading.
