News

3/1/2011
08:43 AM
George Crump

Automatic Storage Optimization

It will come as no shock to any storage manager that the amount of data you need to store is growing. The problem is that your budget is not, or at least not as fast as your need for storage. The speed of that growth also means that traditional techniques may no longer be effective. You need the storage system to just handle it; in other words, storage optimization needs to be automatic.

While there are techniques available to help with expanding capacity requirements, like data migration, archiving and, dare I say it, tiered storage, all of them require manual interaction to work properly. Over time they may still be the right way to manage storage, but right now, if you are running out of capacity and budget, you need a quicker fix.

That fix is probably going to be systems with automatic storage optimization: systems where you just turn on the optimization switch and it works. You don't have to think about it, and it is intelligent enough to know which data it can be effective on and to leave other data alone. You don't have time to teach your optimization system what to optimize and what to leave out. It needs to figure that out, automatically.
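
To make the idea concrete, here is a minimal sketch of how a system might decide on its own which data to optimize: compress a small sample of each block and only bother with the full block if the sample shrinks. The 4 KB sample size and 90% threshold are illustrative assumptions, not values from any shipping product.

```python
import os
import zlib

def worth_compressing(block: bytes, sample_size: int = 4096,
                      threshold: float = 0.9) -> bool:
    """Probe a small sample of the block; only optimize the full block
    if the sample shrinks enough. Sample size and threshold here are
    illustrative choices."""
    sample = block[:sample_size]
    if not sample:
        return False
    probe = zlib.compress(sample, level=1)  # fast level: this is only a probe
    return len(probe) < threshold * len(sample)

text_like = b"the quick brown fox jumps over the lazy dog " * 200
random_like = os.urandom(8192)  # stands in for already-compressed or encrypted data
```

A probe like this is how a system can "leave other data alone": text-like data passes the test, while already-compressed or encrypted data fails it and is stored untouched.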

For automatic storage optimization to work, it has to be delivered with minimal or zero performance impact. The performance aspect of these optimization techniques is critical to their widespread adoption and value. If you can turn on these capabilities, free up some existing capacity and bend the curve on data growth without a penalty, why wouldn't you? Automatic means not having to plan when, and if, optimization is going to happen. As we discussed in our articles "What is Real-Time Data Compression?" and "High Performance Primary Storage Deduplication," thanks to better software techniques, tighter integration with the storage software or file system, and simply faster processing, suppliers should be able to deliver better storage utilization rates without a noticeable performance impact on active, production data.

Automatic also means that the optimization happens in real time, as data is being written to the devices. Real time provides the benefit of optimization throughout the data's lifecycle. It also means you do not have to manage your storage in two states (optimized and non-optimized); it is simply always optimized to its highest level.
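
A sketch of what "always optimized" means in the write path, assuming a simple block interface: every write is compressed before it reaches the backing store and every read is decompressed on the way out, so the data never exists in a non-optimized state that would have to be tracked separately.

```python
import zlib

class OptimizedVolume:
    """Sketch of a real-time optimized write path: data is compressed
    on write and decompressed on read, so the backing store only ever
    holds the optimized form. Illustrative, not a real device driver."""

    def __init__(self):
        self._store = {}  # block id -> compressed bytes

    def write(self, block_id: int, data: bytes) -> None:
        # Optimization happens inline, at write time.
        self._store[block_id] = zlib.compress(data)

    def read(self, block_id: int) -> bytes:
        # Callers always see the original data; the optimized
        # state is invisible to them.
        return zlib.decompress(self._store[block_id])
```

Because callers only ever see original data and the store only ever holds compressed data, there is no "optimized vs. non-optimized" bookkeeping for the administrator to manage.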

If automatic optimization can be delivered without you having to worry about the impact, then a significant rolling-downhill efficiency effect occurs that makes all the other storage processes more efficient. We will dive deeper into this in an upcoming entry.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.
