Value-add software can hinder capacity utilization efficiency

Jon Toigo, CEO and managing principal of Toigo Partners International, and chairman of the Data Management Institute, examines the role of value-add software in data storage optimization in this fifth tip in his seven-part series on the 'storage infrastruggle' -- his term for the storage technology challenges faced by admins. Watch the video above or read the text below to learn what he has to say about value-add practices that impact utilization efficiency or output.

Please read Toigo's entire video-tip series on data management issues

What to consider when evaluating strategies for managing data growth

Storage capacity requirements raised by server virtualization adoption

Disk allocation issues: Too much hard disk increases costs, decreases efficiency

Performance and capacity requirements should determine data storage hierarchy

Cloud storage issues a distraction from solving real storage problems

Tackling data storage challenges a major hurdle for IT leadership

Value-add software provides a way for data storage vendors to differentiate increasingly commoditized storage products, and to "lock in" consumers to a particular software product from a particular vendor.

Invariably, the benefits attributed to value-add technologies are portrayed as far more compelling than the challenges they introduce. Who doesn't want a storage array that is 70% more space-efficient? (Using deduplication to reduce the data stored on the drives in the array may provide data reduction ratios of up to 70:1, according to one leading vendor.) That such a rig offers 30 1-TB SATA drives (available for $79 to $100 each, or roughly $3,000 in total) for an MSRP of "only" $410,000, not including a warranty and maintenance agreement, is beside the point.
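The arithmetic behind that comparison is worth making explicit. Here is a back-of-the-envelope sketch in Python using only the figures cited above (the high end of the drive price range, the $410,000 MSRP, and the vendor's claimed 70:1 reduction ratio); it is an illustration of the cost gap, not a pricing model:

```python
# Back-of-the-envelope comparison using the figures cited in the text.
DRIVE_COUNT = 30
DRIVE_CAPACITY_TB = 1
DRIVE_PRICE = 100          # high end of the quoted $79-$100 range
ARRAY_MSRP = 410_000       # vendor list price, excluding warranty/maintenance
DEDUPE_RATIO = 70          # vendor's claimed 70:1 data reduction

raw_capacity_tb = DRIVE_COUNT * DRIVE_CAPACITY_TB
bare_drive_cost = DRIVE_COUNT * DRIVE_PRICE              # ~$3,000 in raw drives
effective_capacity_tb = raw_capacity_tb * DEDUPE_RATIO   # only if the ratio holds

print(f"Cost of bare drives:    ${bare_drive_cost:,}")
print(f"Array MSRP:             ${ARRAY_MSRP:,}")
print(f"Markup over raw drives: {ARRAY_MSRP / bare_drive_cost:.0f}x")
print(f"MSRP per effective TB:  ${ARRAY_MSRP / effective_capacity_tb:,.2f}")
```

Even granting the vendor's best-case reduction ratio, the markup over the commodity drives inside the box is roughly two orders of magnitude, which is the point Toigo is driving at.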

Moreover, if auto tiering, on-array thin provisioning or short stroking will help "move the needle" on critical business processes in terms of utilization efficiency or output, who cares if the technology is more expensive, obstructs infrastructure management, or provides only tactical, short-term fixes for the strategic issues that constitute the storage infrastruggle? We all should care about the long-term outcome.

For the most part, value-add software focuses on capacity allocation efficiency: delivering optimized capacity to store as much data on disk as possible. Deduplication and compression seek to store the most data (in a reduced form) on the least amount of space. Auto tiering seeks to free up precious high-performance storage by moving data as quickly as possible to capacity storage when its frequency of re-reference falls below a certain threshold. Thin provisioning seeks to provide capacity to applications on a "just in time" basis to minimize the need to support and power more disk drives than the company requires, given current forecasted demand rates.

All these functions focus on capacity allocation issues. While capacity allocation is an important dimension of storage management, it could be argued that allocation efficiency is a tactical matter intended to slow the rate of capacity growth until a solution can be found for the root cause of storage inefficiency and cost: unmanaged data.

Planners need to find ways to position the right data on the right kinds of media based on data utilization characteristics and business context on the one hand, and storage platform characteristics and costs on the other. Capacity utilization efficiency is a strategic goal, while capacity allocation efficiency targets strictly tactical matters.
