Cloud storage can reduce the cost of IT, but service providers must prove they have the proper data security models before larger firms will adopt the model en masse.
Cloud storage, as defined by Enterprise Strategy Group, is a service delivered in one of three ways: by a third-party company that delivers storage services to outside subscribers (public cloud), by a company's internal IT group acting as the service provider (private cloud), or by a combination of the two (hybrid cloud).
Cloud storage must be elastic so it can quickly adapt the underlying infrastructure to changing subscriber demands. It must also be automated so that underlying infrastructure changes can be made, and content can be placed on different storage tiers or in geographic locations quickly and without human intervention. Of course, the devil is in the details. The storage cloud must also be:
- SLA-driven, automated and integrated to provide quick response times to user demands
- Policy-based, with deep levels of automation to move data to appropriate tiers
- Secure, reliable and scalable to multi-petabyte (PB) capacity with a unified management view for both block and file storage.
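The policy-based tiering requirement above can be made concrete with a small sketch. The tier names, age thresholds and function below are hypothetical illustrations of the idea, not taken from any specific product: a policy maps each object's last-access age to a storage tier, so demotion to cheaper tiers can happen automatically, without human intervention.

```python
from datetime import datetime, timedelta

# Hypothetical policy table: objects not accessed within a tier's age
# threshold fall through to the next, cheaper tier. Names and thresholds
# are illustrative only.
TIER_POLICY = [
    ("tier1-ssd", timedelta(days=30)),    # hot data, accessed in last 30 days
    ("tier2-sas", timedelta(days=180)),   # warm data
    ("tier3-cloud-archive", None),        # long-tail content, no age limit
]

def choose_tier(last_access: datetime, now: datetime) -> str:
    """Return the first tier whose age threshold the object satisfies."""
    age = now - last_access
    for tier, max_age in TIER_POLICY:
        if max_age is None or age <= max_age:
            return tier
    return TIER_POLICY[-1][0]

now = datetime(2009, 6, 1)
print(choose_tier(datetime(2009, 5, 20), now))   # recent -> tier1-ssd
print(choose_tier(datetime(2008, 1, 1), now))    # stale  -> tier3-cloud-archive
```

In a real cloud platform this decision would run continuously against object metadata; the point of the sketch is only that the policy is data-driven, so changing SLAs means changing the table, not the code.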
Cloud storage economics should enable both the service provider and the subscriber to benefit. Service providers gain economies of scale with a multi-tenant infrastructure, as well as a predictable, recurring revenue stream. Benefits for subscribers include the following:
Storage costs are shifted to an operating expense. In the current economic downturn, capital dollars are much harder to come by than operational dollars. Cloud storage is a subscription-based service that rolls into operating expenses.
Pay as you use. While some of the top IT shops have storage utilization rates in the 80%-plus range, the industry average is closer to 40% to 50%. That means IT shops are housing, powering, cooling and managing lots of "spinning rust" with no data on it. In a cloud model, users pay only for the amount of storage actually used.
Operating expenses are reduced. The service provider absorbs storage management costs, and can manage the infrastructure more efficiently thanks to multi-tenancy and scale-out virtualized storage platforms.
The data center footprint shrinks. Subscribers don't need to worry about finding new ways to house, power, cool and manage storage capacity, which are some of the biggest challenges facing storage managers today.
Storage tier requirements are easier to manage. Moving stale, rarely accessed data off tier 1 assets lets subscribers optimize tier 1 utilization and extend its useful life, while reducing the cost of storing stale data. Cloud service-level agreements (SLAs) can also be established for applications requiring different security levels.
Business flexibility is provided with subscriber-controlled on-demand capacity and performance. New projects can be kicked off without a long, multi-month waiting period to provision new capacity. Temporary capacity can easily be deployed if a new business opportunity arises.
Remember, not all applications are suited for a cloud storage model. Latency is a key consideration (no matter how good the infrastructure provider is, no one has yet figured out how to beat the speed of light) and will most likely appear as a problem in hybrid and public clouds. The bulk of data stored in the data center in the foreseeable future will be file-based, long-tail content that's latency tolerant. Getting that data offsite to a cloud provider makes economic sense, especially in the current economy. The current wave of public cloud storage business is mostly from small- and medium-sized enterprises (SMEs), as larger enterprises are more likely to consider private and hybrid clouds. Cloud storage service providers in the market today still have a lot of work to do to prove they have the proper data security models in place before larger enterprises will adopt the model en masse, but there are clear opportunities to reduce the overall cost of IT for those who do.
BIO: Terri McClure is a storage analyst at Enterprise Strategy Group, Milford, Mass.