Primary storage dedupe: Requisite for the future
This article is part of the Storage magazine issue Vol. 9, No. 6, September 2010
Tools like automated tiering and thin provisioning help users cope with capacity demands, but more drastic measures, like primary storage data reduction, are needed. Ten years ago, 10 TB was considered a large storage environment. Now hundreds of terabytes are common, and some environments already measure their storage in tens of petabytes. It's safe to assume that data storage capacity growth will continue over the next 10 years as exabyte-scale storage environments begin to emerge and, over time, become mainstream. I talked to one customer who claimed they would have an exabyte of data within the next three years. Housing that much physical storage in the data center is ultimately untenable. So how do we solve the problem? A big part of the answer will come from a combination of technologies. Hard disk drives will continue to become denser; higher capacity drives can store more data in the same physical space. However, fatter disk drives impact application ...
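The core idea behind primary storage deduplication can be sketched in a few lines: fingerprint each data block, store each unique block only once, and keep a per-file index of fingerprints so the original data can be rebuilt. The sketch below is a minimal illustration, not any vendor's implementation; real products work at fixed- or variable-size block granularity with far more engineering around performance and integrity.

```python
import hashlib

def dedupe(blocks):
    """Store each unique block once, keyed by its SHA-256 digest."""
    store = {}   # digest -> block data (one copy per unique block)
    index = []   # logical block sequence, as a list of digests
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep first copy only
        index.append(digest)
    return store, index

def restore(store, index):
    """Rebuild the original block sequence from the dedupe store."""
    return [store[d] for d in index]

# Six logical blocks, but only three unique ones need physical space.
blocks = [b"alpha", b"beta", b"alpha", b"gamma", b"beta", b"alpha"]
store, index = dedupe(blocks)
print(len(blocks), len(store))           # prints: 6 3
assert restore(store, index) == blocks   # data is fully recoverable
```

The capacity savings come from the gap between logical blocks (six) and stored blocks (three); the trade-off for primary storage is that every read and write now passes through the hashing and index lookup path, which is why performance impact is the central design concern.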
Features in this issue
RAID has taken criticism that it can't stand up to the rigors of a modern data storage environment. But 96% of the respondents to our survey said they rely on some form of RAID.
Storage performance issues are often not related to the storage system at all, but rather to the storage network that links servers to disk arrays. These 10 tips will help you find and fix the bottlenecks in your storage network infrastructure.
The fifth edition of our service and reliability survey for midrange arrays shows that users of midrange storage systems are pretty darned satisfied with their purchases.
Companies of all sizes are being inundated with unstructured data that's straining the limits of traditional file storage. File virtualization can pool those strained resources and provide for future growth.
Columns in this issue
Storage vendors have been busy creating server-to-application product stacks. It looks like the type of ploy that will give them more leverage, and take it away from you.
Tools like automated tiering and thin provisioning help users cope with growing capacity demands; but more drastic measures, like primary storage data reduction, are needed.
Learn about a handful of key technologies that can help storage managers meet their backup recovery time objectives (RTOs) by making the first steps -- data capture and transfer -- simpler and more efficient.
Information lifecycle management faded into oblivion without getting serious notice. But it's back now, with a new name and more realistic goals.