Data reduction in primary storage (DRIPS!)
This article is part of the Vol. 11 Num. 8 October 2012 issue of Storage magazine
Although it’s become a staple of backup systems, data reduction is still just beginning to appear in primary storage systems. Here’s how it works and who’s doing it.

The demand for more data storage capacity at most companies continues to grow at eye-watering rates, in some cases as high as 100% per year. And even though the cost per terabyte of storage is dropping, the net effect is one of rising costs. In the secondary storage market, vendors have tackled growth by implementing technologies such as data deduplication and compression. For those data types, the use of space-reduction technologies dramatically shrinks the amount of physical storage required and can significantly cut costs. In the primary storage market, however, we haven’t seen widespread deployment of similar space-saving techniques. Data reduction in primary storage (DRIPS) is in its infancy, but these features are starting to appear in products from the big storage vendors. Let’s take a look at the state of the DRIPS marketplace: why...
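To make the two techniques concrete, here is a minimal sketch of how block-level deduplication and compression combine to reduce stored capacity. This is an illustration only, not any vendor's implementation: it uses fixed-size blocks, a SHA-256 digest as the block identity, and zlib compression, all of which are assumptions for the example.

```python
import hashlib
import zlib

def dedupe_and_compress(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, store each unique block once
    (keyed by its SHA-256 digest) in compressed form, and return the
    block store plus the ordered references needed to rebuild the data."""
    store = {}   # digest -> compressed unique block
    refs = []    # ordered digests describing the logical byte stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:           # new block: compress and keep it
            store[digest] = zlib.compress(block)
        refs.append(digest)               # duplicate: just record a reference
    return store, refs

def restore(store, refs):
    """Rebuild the original byte stream from the block store."""
    return b"".join(zlib.decompress(store[d]) for d in refs)

# Highly redundant data: 100 logical blocks, but only 2 unique ones.
data = b"A" * 4096 * 50 + b"B" * 4096 * 50
store, refs = dedupe_and_compress(data)
physical = sum(len(b) for b in store.values())
assert restore(store, refs) == data       # lossless reconstruction
```

Real arrays refine this in many ways (variable-size chunking, inline versus post-process operation, hash-collision handling), but the capacity win comes from the same two steps: duplicate blocks are stored once, and the unique blocks that remain are compressed.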
Features in this issue
Although it’s become a staple of backup systems, data reduction is still just beginning to appear in primary storage systems. Here’s how it works and who’s doing it.
You can take a lot of the drudgery out of disaster recovery by using virtualization technologies for your company’s servers, storage and desktops.
A new category of storage software is emerging with apps that optimize solid-state storage to help increase I/O performance and fully realize the benefits of flash-based storage.
Find out what respondents have to say about the tiered storage practices in their organizations in the latest survey from Storage magazine.
Columns in this issue
Keeping up with solid-state storage requires some technical know-how, but sometimes flash vendors make the technology harder for users to understand.
Do we really need data scientists to parse our way through all that big data, or will programmers, engineers and admins handle things OK?
As more and more servers are virtualized in data centers, deduplication needs to play a bigger role in protecting their data.
Convergence -- the bundling of storage, compute, network and virtualization -- is already evolving with new products that redefine ease of use.