
News | Commentary
George Crump
8/25/2008 09:21 AM
Migration Relief

In my last entry, on migration migraines, we discussed the challenges of moving from one primary storage provider to another and walked through a few solutions. One of the best ways to make migrations easier is to keep the amount of data on primary storage to a minimum, but what do you do about archives that will grow to petabytes in size?

Moving from one primary storage platform to another is a fact of life. New suppliers will continue to emerge that offer compelling advantages over your current supplier, and new technologies altogether, such as solid-state disk, may force a change of primary storage platforms. The answer, as I have written before, is to move inactive data off primary storage as soon as you can. Keeping primary storage small not only keeps costs in line, it also makes migration between platforms easier by keeping the data set small.
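To make "move inactive data off as soon as you can" concrete, here is a minimal sketch of an access-time-based sweep in Python. The /primary and /archive mount points and the 180-day policy are assumptions for illustration only; a real HSM or archiving product would apply administrator-defined policies and typically leave stubs behind rather than simply relocating files.

    # Minimal tiering sketch: move files untouched for N days from a
    # hypothetical /primary tree to a hypothetical /archive mount.
    import os
    import shutil
    import time

    INACTIVE_DAYS = 180                  # assumed policy: idle for 6 months
    SRC, DST = "/primary", "/archive"    # hypothetical mount points

    cutoff = time.time() - INACTIVE_DAYS * 86400

    for root, _dirs, files in os.walk(SRC):
        for name in files:
            path = os.path.join(root, name)
            if os.stat(path).st_atime < cutoff:   # last access before cutoff
                rel = os.path.relpath(path, SRC)
                target = os.path.join(DST, rel)
                os.makedirs(os.path.dirname(target), exist_ok=True)
                shutil.move(path, target)          # relocate to archive tier

A real deployment would also preserve permissions and leave a pointer on primary storage so users can still find the data; the point here is simply how little it takes to identify and sweep inactive files.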

The potential pitfall of aggressive archiving is that the capacity of that archive will, and should, get very large over time. Archives of 50 TB to 100 TB will become commonplace, and over the next five or six years petabyte-sized archives will not be uncommon. Data sets of this size not only make migration nearly impossible, they make tasks like backup a challenge. Does this invalidate the disk archive concept? No. Growth is inevitable; you have to choose where you want that growth to happen and a platform that can deal with it. As I stated in a prior post on the potential cost savings of archives, you want to grow your storage in the area where it costs the least, and that area is the archive.

How, then, do you deal with large archives? Design the archive to be a permanent fixture in the data center. The answer is scalability, massive scalability. This is where "real" archive solutions come into play: systems designed from the ground up for archiving, as opposed to dense, cheap RAID boxes.
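A back-of-the-envelope calculation shows why steering growth toward the archive tier pays off. The per-terabyte prices and the growth rate below are invented purely for illustration; substitute your own numbers.

    # Illustrative growth-cost comparison with assumed prices:
    # primary disk at $5,000/TB, archive disk at $1,000/TB.
    PRIMARY_COST_PER_TB = 5_000
    ARCHIVE_COST_PER_TB = 1_000

    annual_growth_tb = 50   # assumed: 50 TB of new data per year

    on_primary = annual_growth_tb * PRIMARY_COST_PER_TB
    on_archive = annual_growth_tb * ARCHIVE_COST_PER_TB
    print(f"Growth absorbed on primary: ${on_primary:,}/yr")
    print(f"Growth absorbed on archive: ${on_archive:,}/yr")
    print(f"Savings:                    ${on_primary - on_archive:,}/yr")

With these assumed figures, absorbing the same 50 TB of annual growth on the archive tier instead of primary storage saves $200,000 a year, before counting the easier migrations and smaller backups that come with a lean primary tier.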

Companies such as Copan Systems and Permabit both have systems that can scale to nearly 1 petabyte, and both will exceed that number as drive capacities continue to grow. They get there differently: Copan Systems uses densely packed, power-managed drives, while Permabit uses a grid architecture. The point is that both can scale, offer key redundancy capabilities, and routinely check the health of the data and drives in the system (a simplified illustration of that health check follows below). They are designed for long-term retention of data. Even with all this scalability, technology will continue to march on, so the ability to upgrade these architectures without moving data is going to be critical and is something to pay attention to. This is something cloud storage may seem ideal for, and a topic we will address tomorrow. I am participating in a Cloud Storage 101 Webinar this coming Wednesday at 3 p.m. EDT. If you are interested, please click here.
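The health checking mentioned above can be pictured as a scrub pass over stored objects. The sketch below assumes a hypothetical manifest of SHA-256 digests recorded when files were ingested; real archive platforms run this sort of verification continuously in the background and repair from redundant copies automatically, rather than just reporting.

    # Toy scrub pass over a hypothetical /archive tree, verifying files
    # against a manifest of SHA-256 digests recorded at ingest time.
    import hashlib
    import json
    import os

    MANIFEST = "/archive/.manifest.json"   # hypothetical digest store

    def sha256(path, chunk=1 << 20):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    with open(MANIFEST) as f:
        expected = json.load(f)            # {path: digest} written at ingest

    for path, digest in expected.items():
        if not os.path.exists(path):
            print(f"MISSING  {path}")      # file lost; restore from replica
        elif sha256(path) != digest:
            print(f"CORRUPT  {path}")      # bit rot; repair from redundancy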

Track us on Twitter: http://twitter.com/storageswiss.

Subscribe to our RSS feed.

George Crump is founder of Storage Switzerland, an analyst firm focused on the virtualization and storage marketplaces. It provides strategic consulting and analysis to storage users, suppliers, and integrators. An industry veteran of more than 25 years, Crump has held engineering and sales positions at various IT industry manufacturers and integrators. Prior to Storage Switzerland, he was CTO at one of the nation's largest integrators.

 
