What constitutes reality and fantasy in the virtualization arena?
The reality is that virtualization in the network is inevitable; the fantasy is that vendors will offer fully integrated solutions based on open standards, according to one storage analyst. But some users say that until that fantasy becomes reality, they don't plan on buying in.
"Some of us in the customer community are looking forward to a realistic [virtualization] solution that can scale across arrays of not only the same vendor, but multiple storage vendors' offerings," said Joe Goins, open systems storage engineer for The Boeing Company. "It will be at that point that we can truly commoditize storage," he added.
Virtualization, if it works as fantasized, should be able to mask the differences between storage solutions from different vendors, he said. But it may be some time before the storage networking industry achieves storage neutrality, and it's unlikely to happen without standards.
Virtualization is an "enabler" that makes such applications as storage resource management and policy-based administration possible, said Randy Kerns, partner at The Evaluator Group. But users may not be looking at virtualization realistically in terms of standards and interoperability.
"Standards take two years after you get a proposal that committees agree on. There probably never will be a standard. There's not one for memory virtualization for example," Kerns said.
Kerns called interoperability a vague term. "[Interoperable] with whom? Or what? I always use a database software package such as Oracle as an example. Is it proprietary? Absolutely. Is it in wide use? Absolutely," he said, concluding that users who wait to implement virtualization will be passed by.
"Virtualization is inevitable. It's going to be just like RAID. Ten to 12 years ago, everyone was arguing about it. You don't even think about it now," he noted. "It's just a matter of getting big name vendors behind it."
But none of the big-name vendors, including IBM, Compaq, HP, EMC and Sun, has an all-encompassing virtualization solution targeted at the enterprise data center market. It's mid-range vendors like DataCore and FalconStor that have seen success, with a combined 1,100 virtualization licenses sold, according to Kerns.
"This is not small change, but they're not high profile," Kerns said, explaining that until virtualization solutions are introduced in the enterprise data center market, these users will remain skeptical.
"[Enterprise data center users] want to buy from big names, they want to buy direct and they want to know that the vendor will be around in five years to support the solution," he said.
In a recent SearchStorage survey, 43% of users said virtualization could work but just isn't there yet. Meanwhile, 37% said it's like fuel injection for storage systems, and another 10% said virtualization is interesting but too confusing.
Users who responded to the survey didn't dispute the benefits of virtualization. They noted that the technology is not new -- virtualization has been available at the hardware level for nearly 30 years -- and during that time it has worked to "slice and dice" physical resources and support capabilities such as security, data access and replication.
Today, these users are asking vendors to stop hyping pieces of fabric-based virtualization and start offering tested, proven solutions that can scale across vendor offerings.
"Virtualization solutions need to be robust and scalable beyond a single storage array, fabric or appliance, low latency, and available with no single points of failure," Goins said, adding that, "Many virtualization solutions today just don't fit that list."
In the meantime, users will have to watch for new solutions and monitor their progress; once the top vendors prove those solutions viable, users will be more likely to embrace the technology, according to Kerns.
Kerns noted that users can expect to see network virtualization solutions from big-name vendors in the works, if not deployed, within the next one to two years.