Thin provisioning continues to capture the attention of storage managers as a way to manage capacity allocation. The term thin provisioning refers to a technology for allocating space to applications or end users "just in time," or as their actual storage requirements dictate.
Arguments abound over the origins of thin provisioning, though in the distributed computing world, DataCore Software appears to enjoy bragging rights over newcomers such as Compellent (acquired by Dell) and 3PAR (acquired by HP). What really matters, of course, is the functionality of the technology, which is intended to make storage more economical by permitting storage managers to buy disk drives as they are needed, rather than carrying costly disk inventory the firm does not yet need.
Until recently, the cost of a disk drive dropped by half every 12 months, while capacity doubled every 18 months or so. Since disk would cost half as much next year as it sells for today, it could be argued that disk should be purchased as close to the time of its actual use as possible. Thin provisioning was intended to enable this strategy by monitoring capacity usage and alerting storage managers well in advance of the need to add more disk to the physical capacity pool (an array or group of arrays).
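The deferred-purchase argument can be sketched as simple arithmetic. A minimal illustration, assuming the historical trend that a given capacity's price halves every 12 months (the function name and the $100/TB figure are hypothetical, not vendor pricing):

```python
# Sketch of the deferred-purchase argument: if price per TB halves every
# 12 months, disk bought a year from now costs half what it costs today.
# All figures below are hypothetical illustrations.

def price_after(price_per_tb: float, months: float, halving_months: float = 12.0) -> float:
    """Projected price per TB after `months`, if the price halves every `halving_months`."""
    return price_per_tb * 0.5 ** (months / halving_months)

# Buying 10 TB today at a hypothetical $100/TB, versus deferring 12 months:
cost_now = 10 * price_after(100.0, 0)     # 1000.0
cost_later = 10 * price_after(100.0, 12)  # 500.0
```

Under this (historical) trend, every month of deferral the forecast buys is a direct discount on the eventual purchase.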
In one case study offered by a vendor of thin-provisioning arrays, a reference customer -- a law firm with 80 attorneys -- allocated a terabyte of thin-provisioned capacity to each lawyer from its 10 TB array -- in effect, oversubscribing the array by 700%. That might be regarded as daring, but in fact, each lawyer actually consumed only a small fraction of the allocated terabyte. Courtesy of value-add software on the array, additional capacity requirements are forecast over time, and when more space is needed, it is provided out of a shared pool of disk in small increments.
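The arithmetic behind that case study is worth making explicit. A short sketch using the figures from the example (the percentage convention -- allocated capacity beyond physical capacity -- is an assumption about how the vendor counted):

```python
# The law-firm case study: 80 attorneys, 1 TB thin-provisioned each,
# backed by a 10 TB physical array.

attorneys = 80
allocated_tb_each = 1
physical_tb = 10

allocated_tb = attorneys * allocated_tb_each              # 80 TB promised
oversubscription_pct = (allocated_tb / physical_tb - 1) * 100

print(allocated_tb)          # 80
print(oversubscription_pct)  # 700.0
```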
Presumably, if the forecasting model is good, a storage manager can defer adding more disk to the array until it is absolutely needed. Capacity requirements would be met while driving cost down significantly.
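The kind of forecast this paragraph assumes can be as simple as projecting headroom from the recent growth rate. A toy linear model, purely for illustration (the function, thresholds, and figures are assumptions, not any product's algorithm):

```python
# Toy linear forecast: given current use and a recent growth rate,
# estimate how many days remain before the physical pool is full --
# i.e., how long the next disk purchase can be deferred.

def days_until_full(used_tb: float, physical_tb: float, growth_tb_per_day: float) -> float:
    """Days of headroom left at the current linear growth rate."""
    if growth_tb_per_day <= 0:
        return float("inf")
    return (physical_tb - used_tb) / growth_tb_per_day

# 7 TB used of a 10 TB pool, growing 0.05 TB/day:
print(days_until_full(7.0, 10.0, 0.05))  # ~60 days before more disk is needed
```

Real forecasting software is far more sophisticated, but the principle is the same: the better the projection, the later the purchase.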
The problem with this strategy is, first, its dependence on the forecasting algorithm, which cannot predict a "margin call" event. As in the financial world, a "margin call" is a nonstandard, unexpected request for the full complement of storage that one or more users believe they already possess. If such a margin-call event occurs and the physical storage inventory lacks sufficient real capacity to fulfill the request, career-limiting outcomes may result (e.g., application abends, "disk full" error messages and downtime, among other issues).
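The two failure modes just described -- the unpredictable margin call and the forecast-driven alert that fires too late -- can be sketched side by side. A minimal illustration with hypothetical function names and thresholds; nothing here reflects a real product's behavior:

```python
# The "margin call" exposure: the shortfall if every user suddenly drew
# the full capacity they were promised.

def margin_call_shortfall(allocated_tb: float, physical_tb: float) -> float:
    """TB the pool would be short if all allocations were consumed in full."""
    return max(0.0, allocated_tb - physical_tb)

# A forecast-style alert: flag the pool once actual use crosses a threshold.
# Note it watches actual use only -- it cannot see a margin call coming.

def needs_more_disk(used_tb: float, physical_tb: float, alert_ratio: float = 0.8) -> bool:
    return used_tb / physical_tb >= alert_ratio

# Law-firm figures: 80 TB promised on a 10 TB pool, 7 TB actually used.
print(margin_call_shortfall(80.0, 10.0))  # 70.0 -- the worst-case exposure
print(needs_more_disk(7.0, 10.0))         # False -- no alert, yet 70 TB at risk
```

The point of the sketch is the mismatch: the alert is quiet while the worst-case exposure is enormous, which is exactly the career-limiting scenario the paragraph describes.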
Another limitation of the strategy has to do with how thin-provisioning functionality is implemented. The oversubscription problem can be alleviated to a certain extent by spreading the thin-provisioned capacity across many disk arrays (thereby presenting a larger-capacity inventory) rather than isolating the thin-provisioning function to a single array or stand of disks. DataCore Software does this by implementing thin provisioning across the entire storage infrastructure -- that is, the part of the infrastructure that is virtualized by DataCore and configured as a shared storage pool.
In my final analysis, thin provisioning does nothing to address capacity management per se, but it can help defer the costly acquisition of hard disk drives until such acquisitions are merited by actual need. One caveat: As noted previously, current dynamics in the disk market are not producing capacity improvements or price reductions at the rates experienced since the early 1980s. If capacity and price dynamics do not return to their pre-2010 levels, the argument for thin provisioning loses much of its steam.