Traditional storage vendors are running headlong into the realm of server virtualization, alternately known as utility, blade or autonomic computing. The latest storage vendors to illustrate this trend? EMC, for one, with its $635 million purchase of VMware; and Veritas, with a slew of server software-related acquisitions, the latest being Ejasent in January, for $59 million.
VMware makes virtual machine software that addresses the problem of Intel server proliferation, along with server administration, provisioning and availability software. Ejasent also makes application availability software.
Sounds great, but what does that have to do with storage? Quite a bit, it turns out. And whether you think the idea of storage vendors vying for this space is innovative or foolhardy, the fact remains that before anyone can call utility computing a success, vendors have a lot of work ahead of them integrating virtual servers and storage.
At EMC, a first order of business may be to help make VMware's virtual machine software more network-storage friendly. Today, in order for the virtual servers enabled by VMware technology to work with networked storage resources, e.g., SAN arrays, "all the servers must be connected to all the storage," says Chris Gahagan, EMC vice president of storage infrastructure software.
This state of affairs has two potential downsides. On the one hand, "it seriously limits the scalability of the virtual environment," Gahagan says. On the other, it leaves you "with a wide-open SAN," which, he admits, "no one wants."
Going forward, the goal is to be able to rezone the SAN on the fly by implementing automated LUN masking between a virtual machine and the storage, explains Mike Mullany, VMware's vice president of marketing. Further on down the road, the goal will be to use an array's built-in LUN masking capabilities, Mullany says.
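To make the rezoning idea concrete, here is a toy model of automated LUN masking. It is purely illustrative: real arrays expose masking through vendor management APIs, not a Python dictionary, and all names here (WWPNs, LUN ids, function names) are invented for the sketch. The point is the mechanism -- each host's initiator sees only the LUNs explicitly granted to it, and a migration regrants and revokes access on the fly rather than leaving the SAN wide open.

```python
# Toy model of automated LUN masking (illustrative only; real arrays
# expose this through vendor APIs). Each initiator WWPN sees only the
# LUNs explicitly granted to it.

class MaskingTable:
    def __init__(self):
        self._grants = {}  # initiator WWPN -> set of LUN ids

    def grant(self, wwpn, lun):
        self._grants.setdefault(wwpn, set()).add(lun)

    def revoke(self, wwpn, lun):
        self._grants.get(wwpn, set()).discard(lun)

    def visible_luns(self, wwpn):
        return sorted(self._grants.get(wwpn, set()))


def migrate_vm(table, vm_luns, old_host_wwpn, new_host_wwpn):
    """Rezone 'on the fly': expose the VM's LUNs to the destination
    host, then withdraw them from the source host."""
    for lun in vm_luns:
        table.grant(new_host_wwpn, lun)
    for lun in vm_luns:
        table.revoke(old_host_wwpn, lun)


table = MaskingTable()
for lun in (12, 13):
    table.grant("wwpn-host-a", lun)

migrate_vm(table, [12, 13], "wwpn-host-a", "wwpn-host-b")
print(table.visible_luns("wwpn-host-b"))  # [12, 13]
print(table.visible_luns("wwpn-host-a"))  # []
```

Note the ordering choice in the sketch: access is granted at the destination before it is revoked at the source, so the storage is never unreachable mid-migration.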
Clearly, these are "fundamental things that have to happen in order for the data center to really use [virtual machine technology] in their environment," EMC's Gahagan says.
Another theory is that EMC will use VMware technologies largely to make "a bigger, better, badder disaster recovery and business continuance story," says John Webster, founder of the Data Mobility Group.
Chew on this for a while: "In essence, a server is a bunch of transient data--but live data," says EMC's Gahagan. "If we can find a way to take that data out of the server," take a snapshot of it, as it were, by freezing its memory images, "we'll be able to bring it back in seconds."
Taking a snapshot of a server's state is fundamentally what VMware's VMotion does. Combined with EMC's existing business continuity software such as SRDF, which makes array-level mirrors of application data, EMC should theoretically be able to dramatically expand the breadth of its DR offerings. And that, Gahagan says, "couldn't have happened unless a storage company and a virtualization company got married."
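The DR flow Gahagan describes can be sketched schematically: freeze a server's memory image, mirror it alongside the application data, and restart from the copy. The sketch below is an assumption-laden illustration of that idea only -- VMotion and SRDF expose nothing resembling this Python interface, and every name in it is hypothetical.

```python
# Schematic sketch of the snapshot-and-mirror DR idea (illustrative
# only; not VMware's or EMC's actual APIs).

import copy


class VirtualMachine:
    def __init__(self, name, memory):
        self.name = name
        self.memory = memory   # stand-in for the live memory image
        self.running = True


def snapshot(vm):
    """Freeze the VM's transient-but-live state into an image."""
    return {"name": vm.name, "memory": copy.deepcopy(vm.memory)}


def replicate(image, remote_site):
    """Stand-in for array-level mirroring of the frozen image."""
    remote_site[image["name"]] = copy.deepcopy(image)


def restore(image):
    """Bring the server back from its frozen image."""
    return VirtualMachine(image["name"], copy.deepcopy(image["memory"]))


remote = {}
vm = VirtualMachine("app01", {"pages": [1, 2, 3]})
replicate(snapshot(vm), remote)

recovered = restore(remote["app01"])
print(recovered.memory)  # {'pages': [1, 2, 3]}
```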
Veritas, meanwhile, is quietly paving the road to utility computing with a series of technology acquisitions, plus enhancements to its file system and volume management products. Last month, the company announced Storage Foundation 4.0, the bundle of its Veritas File System and Veritas Volume Manager products that was previously known as Veritas Foundation Suite.
Among the "dozens and dozens" of new features and enhancements to Storage Foundation, a few stand out in terms of utility computing, says Martin Ward, Veritas director of product management.
The first is a feature Veritas calls a portable data container, which lets you store data independently of an application (and of the operating system, thanks to Volume Manager's native understanding of different operating systems' file formats). Thanks to the portable data container, "the volume manager always knows exactly where the data is sitting," Ward says.
Ward also points out two other Storage Foundation features: enhanced dynamic multipathing, for better performance and availability, and a 64-bit, rather than 32-bit, file system. As a 64-bit file system, Veritas File System can now scale to a jaw-dropping 10,000,000TB (10 exabytes), up from a paltry 32TB.
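A quick arithmetic check of the scale jump quoted above, using decimal units (1TB = 10^12 bytes, 1EB = 10^18 bytes):

```python
# Sanity-check the quoted file system limits (decimal units assumed).
TB = 10**12
EB = 10**18

old_limit = 32 * TB             # the previous 32-bit ceiling
new_limit = 10_000_000 * TB     # the quoted 64-bit ceiling

print(new_limit // EB)          # 10 -> i.e., 10 exabytes
print(new_limit // old_limit)   # 312500 -> a roughly 300,000x jump
```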
"When you start talking about utility computing, you need to not only automate your storage, but you also need to be able to scale to a truly massive environment."