
Improve storage utilization rates with storage optimization, capacity reduction techniques

Learn how to improve storage utilization rates using storage optimization and capacity reduction technologies such as data deduplication, thin provisioning, analytics and storage resource management.

There are a growing number of tools and techniques on the market to help IT organizations improve their data storage utilization rates. Arun Taneja, founder and consulting analyst at Hopkinton, Mass.-based Taneja Group, details the storage optimization and capacity reduction products available from a range of enterprise data storage vendors. Read about which tools you might be missing in this FAQ interview or download the MP3 file.

Let's talk about ways to improve an organization's storage utilization rates. What are some of the most popular techniques administrators use to reduce the amount of stored data in their environments?

Taneja: There have been all kinds of tried-and-true solutions over the years. Probably the most popular one has been simply deleting sets of files when they are no longer needed. Obviously, that has become more and more difficult, particularly in this day and age of compliance and regulation. So a lot more automated, policy-based solutions have become available. I would put data deduplication at the very top of that list. There are other techniques, like compression, that have been popular for some time and are increasingly used in conjunction with data deduplication.

Are these technologies to improve storage utilization standard on most storage management solutions today?

Taneja: Well, they are certainly becoming standard. I would start with compression. It is most commonly built into devices or software such as backup applications. So when you are doing a backup and the data comes into the backup application, it is compressed and then put on tape. Many tape drives also have built-in compression, so the backup software can send the data uncompressed all the way to the tape drive and the drive will squeeze it down.
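The effect Taneja describes can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation, using Python's standard `zlib` module in place of whatever codec a given backup product or tape drive actually uses; the function names are hypothetical.

```python
import zlib

def compress_backup(data: bytes, level: int = 6) -> bytes:
    """Compress a backup stream before writing it to tape or disk."""
    return zlib.compress(data, level)

def restore_backup(blob: bytes) -> bytes:
    """Decompress a stored backup stream on restore."""
    return zlib.decompress(blob)

# Backup data tends to be highly repetitive, so it compresses well.
original = b"server log entry: status OK\n" * 1000
stored = compress_backup(original)

print(len(original), len(stored))          # stored copy is far smaller
assert restore_backup(stored) == original  # lossless round trip
```

The same data could instead be sent uncompressed and squeezed by hardware at the tape drive; either way the capacity written to media shrinks, which is the utilization gain.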

Data deduplication is relatively new and has just become popular in the last four or five years. And data deduplication is a very different concept -- most applicable to backup-type information, where you are doing a lot of full backups from week to week and there's a lot of commonality between week 1 and week 2's backup, and it squeezes all of that kind of stuff out.
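The week-to-week commonality Taneja mentions is what block-level deduplication exploits: each backup is chopped into blocks, each block is identified by a content hash, and a block already in the store is never written twice. A minimal sketch, with an illustrative tiny block size and a hypothetical `DedupStore` class (real products use KB-sized blocks and far more sophisticated indexing):

```python
import hashlib

BLOCK_SIZE = 8  # tiny for illustration; real systems use KB-sized blocks

class DedupStore:
    """Store each unique block once, keyed by its content hash."""
    def __init__(self):
        self.blocks = {}  # hash -> block bytes, written at most once

    def ingest(self, data: bytes) -> list:
        """Split a backup into blocks; return the recipe of hashes."""
        recipe = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)  # duplicates cost nothing
            recipe.append(digest)
        return recipe

    def restore(self, recipe: list) -> bytes:
        return b"".join(self.blocks[d] for d in recipe)

store = DedupStore()
week1 = b"AAAAAAAABBBBBBBBCCCCCCCC"  # full backup, week 1
week2 = b"AAAAAAAABBBBBBBBDDDDDDDD"  # week 2: only the last block changed
r1, r2 = store.ingest(week1), store.ingest(week2)

# Two full backups, but only four unique blocks are physically stored.
print(len(store.blocks))  # 4
assert store.restore(r1) == week1 and store.restore(r2) == week2
```

Because the two weekly fulls share most of their blocks, the store holds far less than two copies' worth of data, which is exactly the capacity reduction deduplication delivers on backup workloads.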

In the grand scheme of things, I would say that data deduplication has covered 10% of the overall data that exists in the marketplace, if that.

What about thin provisioning?

Taneja: Thin provisioning is a relatively new concept that is extremely powerful. I would say that a good majority of storage vendors today have included it in their storage arrays, but it was newer companies like 3PAR, Compellent and several others that actually brought the concept to bear. When you allocate a certain amount of storage in a traditional storage array, that storage is basically taken out of the array; in other words, it's not available to any other application. So there was a huge amount of waste. Thin provisioning means that the storage administrator can allocate whatever amount of storage he or she wants to an application, and only the amount actually written by the application is taken away from the storage array.
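The distinction between allocated and consumed capacity can be sketched as follows. This is a conceptual illustration only, with a hypothetical `ThinVolume` class and an illustrative 1 MiB extent size; real arrays track allocation in vendor-specific extent or page granularities.

```python
class ThinVolume:
    """A volume that advertises a large logical size but consumes
    physical capacity only for extents that are actually written."""
    EXTENT = 1024 * 1024  # 1 MiB extents, chosen for illustration

    def __init__(self, logical_size: int):
        self.logical_size = logical_size
        self.extents = set()  # indices of extents backed by physical space

    def write(self, offset: int, length: int):
        first = offset // self.EXTENT
        last = (offset + length - 1) // self.EXTENT
        for i in range(first, last + 1):
            self.extents.add(i)  # physical space allocated on first write

    def physical_used(self) -> int:
        return len(self.extents) * self.EXTENT

# Administrator provisions 100 GiB; the application writes only 3 MiB,
# so only 3 MiB of physical capacity leaves the shared pool.
vol = ThinVolume(100 * 1024**3)
vol.write(0, 3 * 1024**2)
print(vol.logical_size, vol.physical_used())
```

The application and its administrators see the full 100 GiB they asked for, while the array's remaining physical capacity stays available to other applications until it is actually written.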

So it's an extremely efficient way of keeping the application happy, keeping the storage administrator happy, keeping the database administrator happy, and also doing very, very good storage optimization. It is now becoming a must-have in pretty much all IT operations of any decent size and, as I mentioned, all popular storage arrays from large companies like EMC, Hitachi Data Systems, IBM, Sun, NetApp, etc., have thin provisioning capability today.

What analytical tools are available for mapping, analyzing and improving storage utilization rates?

Taneja: There's actually been quite a bit of progress in that area over the past five, six, seven years. There are companies like Tek-Tools [and] Aptare, and there's a product from EMC [called Data Protection Advisor, formerly called WysDM].

And many of these products actually came in from the data protection side in that they would provide you with information or visibility into what was happening in the data protection environment.

Many of those kinds of tools have now started to extend themselves into the primary storage management side. And a lot of them have capabilities not only on the analytical side but also to optimize capacity.

Are there any techniques or tools on the horizon that could change how admins manage storage resources and achieve capacity reduction?

Taneja: Practically all storage arrays come with some level of storage management software. Whether you are talking about EMC's Symmetrix or Clariion, or Hitachi Data Systems' Universal systems, they all come with some type of storage management software.

What I find changing now is that additional third-party tools are becoming available. For example, Virtual Instruments has a product called VirtualWisdom. IBM has its TotalStorage Productivity Center. EMC has another software product called Ionix ControlCenter, and Dell has similar capabilities. I wouldn't necessarily say that they are brand new, but they keep getting better, not only at identifying where storage optimization needs to happen, but also at giving you the ability to make the transfer, or take whatever other action is needed, from a common console.
