Storage tiering gets more automated
This article is part of the Vol. 11, No. 8, October 2012 issue of Storage magazine
Find out what respondents have to say about tiered storage practices in their organizations in the latest Storage magazine survey. Tiering -- putting data on the most appropriate type of storage -- has become one of the basic best practices of effective data storage management, and 53% of our survey respondents report that their shop's storage systems are tiered. That's about the same number we've seen over the past four years, so there haven't been many converts. The big news isn't how many systems are tiered, but how the tiering is done: 54% of those who tier now use an automated process to handle data movement. That's more than twice last year's number and nearly three times as many as four years ago. The other big change is the use of solid-state storage for tier zero: 60% have flash-based tier-zero storage, more than triple the tally from three years ago. The most popular drives for tier one are still 15K rpm disks, with 40% using the SAS version and 20% sticking with Fibre Channel. Among ...
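The automated data movement described above typically works by tracking how recently data has been accessed and promoting or demoting it between tiers accordingly. Here's a minimal sketch of that idea in Python; the tier labels, thresholds and the `Extent` record are illustrative assumptions, not any vendor's actual policy engine:

```python
from dataclasses import dataclass

# Hypothetical tier labels: 0 = flash, 1 = 15K rpm SAS disk, 2 = capacity disk.
TIERS = {0: "flash", 1: "15K SAS", 2: "capacity"}

@dataclass
class Extent:
    """A chunk of data the tiering engine tracks (names are illustrative)."""
    name: str
    days_since_access: int
    tier: int

def target_tier(days_since_access: int) -> int:
    """Map access recency to a tier; the thresholds here are made up."""
    if days_since_access <= 1:
        return 0   # hot data belongs on flash (tier zero)
    if days_since_access <= 30:
        return 1   # warm data on fast disk (tier one)
    return 2       # cold data demoted to capacity storage

def plan_moves(extents):
    """Return (name, from_tier, to_tier) for every extent out of place."""
    return [(e.name, e.tier, target_tier(e.days_since_access))
            for e in extents
            if target_tier(e.days_since_access) != e.tier]

extents = [
    Extent("db-index", 0, 2),     # hot data stuck on capacity disk: promote
    Extent("archive-q1", 400, 0), # cold data hogging flash: demote
]
print(plan_moves(extents))  # [('db-index', 2, 0), ('archive-q1', 0, 2)]
```

A real array would run a scan like this on a schedule and move the extents in the background; the point of automation is that nobody has to hand-pick which LUNs live on flash.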
Features in this issue
Although it’s become a staple of backup systems, data reduction is still just beginning to appear in primary storage systems. Here’s how it works and who’s doing it.
You can take a lot of the drudgery out of disaster recovery by using virtualization technologies for your company’s servers, storage and desktops.
A new category of storage software is emerging with apps that optimize solid-state storage to help increase I/O performance and fully realize the benefits of flash-based storage.
Find out what respondents have to say about tiered storage practices in their organizations in the latest Storage magazine survey.
Columns in this issue
Keeping up with solid-state storage requires some technical know-how, but sometimes flash vendors make the technology harder for users to understand.
Do we really need data scientists to parse our way through all that big data, or can programmers, engineers and admins handle things OK?
As more and more servers are virtualized in data centers, deduplication needs to play a bigger role in protecting their data.
Convergence -- the bundling of storage, compute, network and virtualization -- is already evolving with new products that redefine ease of use.