Every few years a technology comes along that gets hyped to the point that no one is truly sure what it means anymore. In the storage world, storage virtualization and information lifecycle management (ILM) are two of those technologies that received a lot of vendor buildup and held a lot of promise for reducing the cost and complexity of the storage infrastructure but never really played out. The latest and greatest technology to hit the peak of the hype cycle is cloud storage.
Data storage vendors aren't doing themselves any favors with all the hype and cloudy (forgive the pun) long-term vision around cloud storage, but it doesn't appear to be headed down the same road as ILM and storage virtualization. Unlike ILM and the original vision (circa 2001-2002) of storage virtualization, there are real and immediate benefits to be derived from adopting a cloud storage tier:
- Subscription-based services that roll up as an operating expense.
- Reduced upfront and ongoing storage costs, with less on-premises capacity to purchase.
- Near-perfect economics, with the subscriber paying only for the capacity that's used.
- Reduced operating expenses, because the service provider can manage the infrastructure more efficiently thanks to multitenancy and scale-out virtualized storage platforms.
- Improved business flexibility, including easy deployment of temporary capacity.
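The pay-only-for-what-you-use economics in the list above can be made concrete with a back-of-the-envelope comparison. The sketch below contrasts a traditional upfront capacity purchase with a monthly subscription billed on actual usage; all prices and capacities are illustrative assumptions, not vendor figures.

```python
# Hypothetical cost comparison: upfront array purchase vs. pay-per-use cloud tier.
# All figures are illustrative assumptions, not vendor pricing.

def upfront_cost(purchased_tb, cost_per_tb):
    """Traditional model: pay for the full purchased capacity up front."""
    return purchased_tb * cost_per_tb

def cloud_cost(used_tb_per_month, cost_per_tb_month):
    """Subscription model: pay each month only for the capacity actually used."""
    return sum(tb * cost_per_tb_month for tb in used_tb_per_month)

# An archive that grows from 10 TB to 21 TB over a year.
monthly_usage = [10 + m for m in range(12)]

san = upfront_cost(purchased_tb=25, cost_per_tb=3000)    # sized for peak plus headroom
cloud = cloud_cost(monthly_usage, cost_per_tb_month=50)  # billed on consumed capacity

print(f"Upfront purchase: ${san:,}")         # → Upfront purchase: $75,000
print(f"First-year cloud cost: ${cloud:,}")  # → First-year cloud cost: $9,300
```

The gap narrows as usage approaches the purchased capacity, which is why the model favors slow-growing archive data rather than hot primary storage.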
With all these benefits, why haven't users jumped on the cloud bandwagon en masse? Well, there are various concerns about information security in the cloud and some confusion as to what constitutes cloud storage. IT managers are paranoid about what could happen to corporate information once it leaves the boundaries of corporate IT, which is a good trait in an IT manager. However, if you dig a little deeper you'll find cloud storage has been adopted in some sectors as a data archive tier, and has been delivering those cost-saving benefits for quite some time.
Effective data archive tier
Cloud storage makes sense for data that needs to be retained for long periods of time, is shared by multiple users, needs to be easily accessible and can tolerate some access latency -- which perfectly describes the requirements for digital medical images and healthcare records. Add to those requirements the huge explosion of semistructured data in healthcare -- driven by electronic medical records and advances in content capture devices that create ever-denser images -- and it becomes clear why cloud storage has become a viable data archive tier for the healthcare industry.
There are numerous examples of regional healthcare systems sharing a centralized digital image archive to store images such as CT scans and x-rays. These shared archives can be considered private storage clouds in which archive storage services are offered to healthcare network members as a way to cost-effectively store images and medical records accessed over IP networks. Deploying cloud storage archiving within a private storage cloud mitigates security concerns, as the storage service isn't shared outside the healthcare network.
The massive growth of unstructured data isn't just a healthcare problem; Enterprise Strategy Group estimates that unstructured data will make up the vast majority of data in commercial data centers by 2012, which affects companies of all sizes in every industry. With the advent of Web 2.0 applications, increasing regulatory oversight of how data is stored and for how long, as well as growth in the use of rich media, commercial enterprises will see unstructured data growth rates that exceed anything they've experienced in the past. Most unstructured data is frequently accessed only for the first two weeks after it's created; access then slowly tapers off and stops, yet the data needs to be retained for long periods of time for business or regulatory reasons. This data is a good near-term candidate for being moved off the expensive tier 1 or tier 2 system where it was created and onto a public cloud-based archive tier.
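The access pattern described above lends itself to a simple age-based tiering policy: files whose last access falls outside a cutoff window become candidates for the archive tier. The sketch below illustrates the idea; the 14-day threshold and the reliance on filesystem access times are illustrative assumptions (many production movers track access in an index instead, since some filesystems update atime lazily or not at all).

```python
# A minimal sketch of age-based archive selection: files not accessed within
# the cutoff window are flagged as candidates for a cloud archive tier.
# The 14-day threshold and use of st_atime are illustrative assumptions.
import os
import time

ARCHIVE_AFTER_DAYS = 14  # access typically tapers off after ~2 weeks


def archive_candidates(root, now=None):
    """Return paths under `root` whose last access predates the cutoff."""
    now = now if now is not None else time.time()
    cutoff = now - ARCHIVE_AFTER_DAYS * 86400
    candidates = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.stat(path).st_atime < cutoff:
                candidates.append(path)
    return candidates
```

A real archiving tool would follow this selection step with a copy to the cloud tier, a verification pass and a stub or delete of the source file.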
While there's a huge potential for saving on both CAPEX and OPEX fronts by moving long-tail data into the cloud, IT organizations considering a public cloud archiving option need to go into any cloud storage service provider evaluation with their eyes wide open. There's a high degree of variance in the transparency provided by cloud archive service providers into how data is stored, protected and secured, as well as the user's ability to audit access and authenticity to ensure regulatory compliance.
So the technology is here, but you need to be cautious. The Internet has reached every corner of the world, effectively creating a flat global network with few, if any, barriers to connectivity. The combination of wide-area network (WAN) acceleration and ubiquitous network connectivity allows business to be conducted anywhere. On the platform front, scale-out, commodity-based platforms that provide massive scalability, parallel data transfers and economies of scale -- while maintaining ease of use and management -- are available. And the application profiles that can withstand the latency associated with storing data remotely are better understood. Cloud storage can now be leveraged as part of a data storage tiering model for persistent data. And it will become a standard part of that model, provided vendors get past pushing hype and get down to discussing what can be delivered today.
BIO: Terri McClure is a storage analyst at Enterprise Strategy Group, Milford, Mass.
This was first published in January 2010