Commentary
George Crump
3/1/2011 08:43 AM

Automatic Storage Optimization

It will come as no shock to any storage manager that the amount of data you need to store is growing. The problem is that your budget is not, or at least not as fast as your need for capacity. The speed of that growth also means that traditional techniques may no longer be effective. You need the storage system to just handle it; in other words, storage optimization needs to be automatic.

While there are techniques available, like data migration, archiving and, dare I say it, tiered storage, to help with expanding capacity requirements, all of them require manual interaction to work properly. Over time they may still be the right way to manage storage, but right now, if you are running out of capacity and budget, you need a quicker fix.

That fix is probably going to be systems with automatic storage optimization: systems where you just turn on the optimization switch and it works. You don't have to think about it, and it is intelligent enough to know which data it can be effective on and to leave other data alone. You don't have time to teach your optimization system what to optimize and what to leave out; it needs to figure that out, automatically.
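As a rough illustration of the kind of decision an automatic optimizer has to make on its own, the sketch below test-compresses a small sample of an incoming block and only stores the compressed form if the savings cross a threshold. The function names, the 1 KB sample size and the 10% savings threshold are assumptions chosen for illustration, not a description of any particular vendor's implementation.

```python
import zlib

SAMPLE_SIZE = 1024   # assumed size of the quick compressibility probe
MIN_SAVINGS = 0.10   # assumed threshold: skip data that saves less than 10%

def should_compress(block: bytes) -> bool:
    """Test-compress a small sample to guess whether the full block is worth compressing."""
    sample = block[:SAMPLE_SIZE]
    if not sample:
        return False
    probe = zlib.compress(sample, level=1)        # fast level, just for the probe
    savings = 1.0 - (len(probe) / len(sample))
    return savings >= MIN_SAVINGS

def optimize_block(block: bytes):
    """Return the bytes to store and a flag saying whether they were compressed."""
    if should_compress(block):
        return zlib.compress(block, level=6), True
    return block, False   # already-compressed media, encrypted data, etc. is left alone

# Example: repetitive text compresses well, random bytes do not
import os
text_block = b"log entry: user login ok\n" * 400
random_block = os.urandom(10_000)
print(optimize_block(text_block)[1])    # True  -> stored compressed
print(optimize_block(random_block)[1])  # False -> stored as-is
```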

For automatic storage optimization to work, it has to be delivered with minimal or zero performance impact. The performance aspect of these optimization techniques is critical to their widespread adoption and value. If you can turn on these capabilities, free up some existing capacity and bend the curve on data growth without a penalty, why wouldn't you? Automatic means not having to plan when and if optimization is going to happen. As we discussed in our articles "What is Real-Time Data Compression?" and "High Performance Primary Storage Deduplication," thanks to better software techniques, tighter integration with the storage software or file system, and simply faster processing capabilities, suppliers should be able to provide better storage utilization rates without causing a noticeable performance impact on active, production data.
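To make the deduplication side of that concrete, here is a minimal sketch of inline, block-level deduplication: each fixed-size chunk is hashed as it is written, and a chunk whose hash has already been seen is stored only as a reference. The 4 KB chunk size, the use of SHA-256 and the in-memory index are assumptions made for brevity; real systems use far more sophisticated indexing to keep the lookup off the write path's critical latency.

```python
import hashlib

CHUNK_SIZE = 4096   # assumed fixed chunk size for the sketch

class DedupStore:
    """Toy inline deduplication: store each unique chunk once, keyed by its hash."""

    def __init__(self):
        self.chunks = {}    # chunk hash -> chunk bytes (the "physical" capacity used)
        self.recipes = {}   # object name -> ordered list of chunk hashes

    def write(self, name: str, data: bytes) -> None:
        hashes = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            # Only new chunks consume capacity; duplicates become references.
            self.chunks.setdefault(digest, chunk)
            hashes.append(digest)
        self.recipes[name] = hashes

    def read(self, name: str) -> bytes:
        return b"".join(self.chunks[h] for h in self.recipes[name])

store = DedupStore()
payload = b"A" * 8192
store.write("vm1.img", payload)
store.write("vm2.img", payload)     # identical data, deduplicated on write
assert store.read("vm2.img") == payload
print(len(store.chunks))            # 1 unique chunk stored, not 4
```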

Automatic also means that the optimization happens in real time, as data is being written to the devices. Real time provides the benefit of optimization throughout the data's lifecycle. It also means you do not have to manage your storage in two states, optimized and non-optimized; it is simply always optimized to its highest level.

If automatic optimization can be delivered without you having to worry about the impact, then a significant roll-downhill efficiency effect occurs that makes all the other storage processes more efficient. We will dive deeper into this in an upcoming entry.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
