Capacity allocation efficiency explained

Jon Toigo provides tips on how to improve capacity allocation efficiency. Successful allocation prevents storage-related downtime and associated costs.


In this tip series, Jon William Toigo offers up his five best tips for maximizing efficiency in today’s complex data storage environments. This tip covers capacity allocation efficiency, including advice on how to prevent storage-related downtime.

Most storage vendors use the terms “capacity allocation efficiency” and “capacity utilization efficiency” interchangeably in their product brochures and press announcements, as though they're synonymous. However, the two terms mean distinctly different things.

In this tip, I’ll explain capacity allocation efficiency, which refers simply to a measure of how well existing stores of disk capacity are provisioned to applications and users. The objective is also simple: We want to avoid a dreaded and potentially career-limiting “disk full” error message. In a broader sense, successful allocation efficiency strategies help prevent storage-related downtime and, by extension, its associated costs.

The quest for capacity allocation efficiency also has a distinctly financial dimension. It saves money on storage, which currently accounts for between 33% and 70% of all IT hardware spending annually. To understand how, just consult the trends we've witnessed in disk drive technology for the past three decades.

The cost of a disk drive has declined by 50% approximately every 12 months since the mid-1980s, while the capacity of a drive has doubled about every 18 months since the days of disco music. To capitalize on these trends, storage administrators have sought approaches that would enable them to buy disk storage “just in time” to meet requirements. Doing so enables the most capacious drives to be purchased for the lowest possible price -- a good thing.
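The arithmetic behind the just-in-time argument can be sketched directly from the two trends above. The figures below are illustrative only, not vendor pricing; the function names and starting numbers are my own assumptions.

```python
# Illustrative sketch of the drive-pricing trends described above:
# drive cost halves roughly every 12 months, while drive capacity
# doubles roughly every 18 months. Starting values are hypothetical.

def drive_cost(initial_cost, months):
    """Projected price of a drive after `months`, halving every 12 months."""
    return initial_cost * 0.5 ** (months / 12)

def drive_capacity(initial_tb, months):
    """Projected drive capacity after `months`, doubling every 18 months."""
    return initial_tb * 2 ** (months / 18)

def cost_per_tb(initial_cost, initial_tb, months):
    """Cost per terabyte at a given point on both trend curves."""
    return drive_cost(initial_cost, months) / drive_capacity(initial_tb, months)

# Buying today vs. deferring the same purchase by one year:
today = cost_per_tb(300.0, 2.0, 0)       # $150/TB at time of writing
next_year = cost_per_tb(300.0, 2.0, 12)  # roughly $47/TB a year later
```

Under these assumed curves, deferring a purchase by a single year cuts the cost per terabyte by roughly two-thirds, which is why buying capacity early, before it is actually needed, is so expensive.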

This strategy isn’t new; it dates back to the earliest thinking about inventory management. The typical MBA holder knows that the best way to reduce the cost of an inventory is to avoid overstocking, especially when the widgets being acquired are subject to obsolescence or spoilage, or when there's a reasonable expectation that prices will fall over time. The challenge is to stage widgets in inventory so that stock is always available when needed, but to do so in a manner that limits the retention or carrying costs of the widgets themselves. That’s Business 101, and your senior management might view your capacity allocation efficiency initiatives favorably if you frame your goals in this “business-savvy” context.

Finding your capacity allocation strategy

Efforts to improve capacity allocation efficiency generally come down to a combination of technology and technique. On the technology side, vendors have been offering on-array “thin provisioning” for a few years: software that serves up space in dribs and drabs from a pool of installed capacity, monitoring consumption patterns and forecasting the need to add capacity before the shared pool is exhausted. While this sounds like the perfect automation of the old just-in-time inventory management concept, it can run afoul of several issues.
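The mechanics described above can be sketched in a few lines: volumes are promised more capacity than is physically installed, physical blocks are consumed only as data is actually written, and a forecast warns before the pool runs dry. The class and method names here are hypothetical, not any vendor's API, and the threshold-based forecast is a deliberately naive stand-in for the proprietary algorithms discussed below.

```python
# Minimal sketch of on-array thin provisioning: over-commit promised
# capacity, draw physical blocks only on write, warn before exhaustion.
# Names and the threshold forecast are illustrative assumptions.

class ThinPool:
    def __init__(self, physical_gb, warn_threshold=0.8):
        self.physical_gb = physical_gb
        self.used_gb = 0.0
        self.provisioned_gb = 0.0   # sum of promises; may exceed physical
        self.warn_threshold = warn_threshold

    def provision_volume(self, size_gb):
        """Promise capacity to an application without consuming space yet."""
        self.provisioned_gb += size_gb

    def write(self, gb):
        """Consume physical blocks as data is actually written."""
        if self.used_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted: the dreaded 'disk full'")
        self.used_gb += gb
        return self.needs_more_capacity()

    def needs_more_capacity(self):
        """Naive forecast: warn once utilization crosses the threshold."""
        return self.used_gb / self.physical_gb >= self.warn_threshold

pool = ThinPool(physical_gb=1000)
pool.provision_volume(800)   # three 800 GB promises against 1 TB
pool.provision_volume(800)   # of physical disk: 2.4x over-commitment
pool.provision_volume(800)
warn = pool.write(500)       # 50% used, no warning yet
```

Note that the over-commitment only works so long as actual writes stay below the physical ceiling; the forecast, not the promises, is doing all the real work.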

Read the entire Toigo tip series on storage efficiency

Improve cost, energy savings through capacity utilization management

Ensure business continuity with these five data protection guidelines

Achieve power efficiency with green data storage techniques

Tips for evaluating efficient storage performance

For one, the licensing fees associated with the “value-add” software that provides thin provisioning functionality on an array controller increase the cost of the storage array itself -- often to such a level that they cancel out the cost savings of the just-in-time disk acquisition approach they enable. In truth, consumers may never realize any payback (in the form of deferred disk drive acquisition) from a thin provisioning array over the useful life of the rig (about five to seven years, on average).

Second, the thin provisioning benefit of the on-array software is entirely contingent upon the capacity forecasting algorithms used by the vendor. These algorithms are often a trade secret, so it's difficult to evaluate their efficacy before purchasing the product. However, it's not hard to envision a situation (think “margin call”) in which a sudden, “non-forecastable” demand for capacity could exceed the total capacity available. When one company acquires another and decides to merge customer service databases, for example, the resulting un-forecasted demand could exceed available supply. One fan of on-array thin provisioning dismissed this possibility at a recent trade show by noting that he keeps 40% of capacity “in reserve” at all times as a hedge against this potential problem. That's tantamount to not using thin provisioning at all.

Third, on-array software-based thin provisioning is limited by the scalability of the rig itself. When the storage assets of one rig are fully provisioned and it becomes necessary to stand up another, no hardware vendor I know of has technology to “extend” its thin provisioning functionality to the next box. This creates islands of thin provisioning and increases the overall storage resource management burden by introducing multiple boxes, with multiple capacity demand forecasting engines, to monitor.

To be clear, these issues with thin provisioning, when that functionality is delivered in the form of value-add software on individual arrays of disk drives, don't invalidate the concept altogether. An alternative strategy is to virtualize storage arrays using a hardware-agnostic storage virtualization software package that delivers thin provisioning benefits across all hardware rather than isolating them to the confines of a specific rig. The problems of high array costs, non-forecastable demand events and multiple forecasting engine management are effectively fixed by such a strategy.
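The difference the virtualization approach makes can be illustrated with a sketch: a hardware-agnostic layer pools free capacity across every array it manages, so a single allocation request, and a single forecasting engine, can span rig boundaries instead of being trapped on one box. The names below are hypothetical and the allocation policy is deliberately simple; real virtualization products differ.

```python
# Sketch of cross-array storage virtualization: free capacity from
# multiple rigs is presented as one aggregate pool, and one request
# can spill across rig boundaries. Names and policy are illustrative.

class Array:
    def __init__(self, name, free_gb):
        self.name = name
        self.free_gb = free_gb

class VirtualPool:
    def __init__(self, arrays):
        self.arrays = arrays

    def total_free_gb(self):
        """One forecasting engine sees the aggregate, not per-rig islands."""
        return sum(a.free_gb for a in self.arrays)

    def allocate(self, gb):
        """Satisfy one request from whichever arrays have space."""
        if gb > self.total_free_gb():
            raise RuntimeError("aggregate pool exhausted")
        placements = []
        for a in self.arrays:
            if gb <= 0:
                break
            take = min(a.free_gb, gb)
            if take > 0:
                a.free_gb -= take
                placements.append((a.name, take))
                gb -= take
        return placements

pool = VirtualPool([Array("rig-A", 100), Array("rig-B", 400)])
# A 300 GB request exceeds rig-A alone, but not the aggregate:
placed = pool.allocate(300)
```

With per-rig thin provisioning, the same 300 GB request against rig-A would simply fail; pooling across hardware is what absorbs it.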

Whichever route you decide on, managing capacity will still require hands-on work. There is no silver bullet solution to the capacity problem that plagues the modern-day storage industry.

BIO: Jon William Toigo is a 30-year IT veteran and is CEO and managing principal of Toigo Partners International and chairman of the Data Management Institute.

This was first published in March 2012
