If you were tired of hearing about cloud storage in 2010, you'd better buy ear plugs. Storage executives agree that cloud hype and adoption will increase significantly this year, and insist more people are using the cloud than they realize.
Storage execs also see technologies that scratched the surface in 2010, such as primary data reduction, solid-state storage and 10 Gigabit Ethernet (10 GbE) SANs, playing more prominent roles this year.
"We'll see the same economic conditions and the same major IT themes this year as we saw in 2010," said Val Bercovici, NetApp's cloud czar who leads the strategic planning team in the company's CTO office. "2011 will be a year of solidification and increased adoption of some key trends that began in 2009 and 2010."
Cloud storage: Pick your model
Data storage vendors spent a lot of time talking about the cloud in 2010, trying to convince people that their products are truly cloud-worthy. Expect that to continue in 2011. But storage executives say it's not a question of if the bulk of their customers will use the cloud, but whether they'll adopt a private cloud, public cloud or hybrid cloud model.
"We don't expect the hype to slow down at all," said Steve Wojtowecz, IBM's vice president (VP) of storage software development. "If you're an IT manager, CIO or CTO, unless you're talking about the cloud or how to implement it, you're going to become irrelevant."
Craig Nunes, director of storage marketing at Hewlett-Packard (HP) Co., envisions organizations this year using a hybrid cloud model, farming out some applications to a service provider while keeping others in-house.
"I think 2011 is the year of the hybrid cloud," he said. "By hybrid, I mean a combination of the public cloud and internal private cloud. You might run CRM in a private cloud, but another application in a public cloud. Mainstream organizations will probably operate in this hybrid model. A lot of folks are already wading into hybrid clouds because the economics has proved it's a better way."
The storage executives also maintain that many organizations are already using the cloud without officially sanctioning it. This ad hoc move to the cloud is led by development teams looking to acquire storage quickly.
"There's an undercurrent for a stealth cloud," NetApp's Bercovici said. "Application developers are using the public cloud with separate identities not tied to the corporate identity. They're doing this because of agility and cost. How many stealth clouds become driven through corporations and become legitimate or live outside IT? Developers are power brokers for how the cloud emerges."
"It's remarkable how much cloud adoption mirrors server virtualization," Shaun Walsh, VP of corporate marketing at Emulex Corp., said of the cloud. "It started at the test/dev level, then moved to the edges of the enterprise and we're slowly seeing it start to migrate into the enterprise.
"Developers couldn't get storage allocated to them by IT for three weeks, so they got storage allocated from [Amazon] S3 on credit cards and expensed it," he continued. "I think you'll see that happen more. Remote sites will use the cloud first. From a data center perspective, it's still an experiment. There are still issues with quality of service, data protection, etc. But the model is proving itself and making believers a few people at a time."
Janae Stow Lee, senior vice president at Quantum Corp., agreed, saying it will take better methods of moving data into the cloud and managing it there to accelerate adoption.
"SMBs and departments are adopting it as a fast, flexible way to get storage," she said. "I think we're due for a tape-falling-off-the-truck episode in the cloud, where somebody loses data. The system management tools we need are never there the day we get started. It takes us a while to catch up with the holistic management we need to take advantage."
Data reduction ready for primary storage?
In 2010, vendors set the stage for extending data deduplication and compression from backup to primary data. EMC rolled out inline primary data compression software, Dell Inc. and IBM bought primary data reduction vendors, and Permabit Technology Corp. landed OEM deals with BlueArc Corp. and Xiotech Corp. for its primary dedupe software.
NetApp is the leader in primary deduplication. The vendor claims more than 15,000 customers are running its free primary dedupe software on more than 100,000 systems. NetApp's Bercovici said he expects more customers to turn it on this year for more business-critical applications.
"It's trending almost exactly to how VMware adoption trended," Bercovici said of primary deduplication. "First, it's used on applications that are less critical and more proof of concept. In 2011, we'll see deduplication on more business applications. There's more of a comfort level in primary deduplication in hypervisor scenarios. And we're seeing it on a lot of rich media that's unstructured rather than transaction data like Oracle applications."
But not everyone sees primary dedupe and compression as a panacea for data storage growth. HP launched its StoreOnce deduplication last year for backup data with plans to eventually use the technology for primary data, but HP's Nunes said the vendor is in no rush to offer primary dedupe because it comes with a price.
"I think the polish will wear off this year when people see the performance overhead," Nunes said. "There's a layer of overhead with deduplication on virtual servers, and that story will get tired in 2011. We will not push primary deduplication until we have a real answer."
However, IBM's Wojtowecz said many organizations will turn to data reduction to combat ever-increasing data stores. "Right now, it's not used much at all," he said. "But tightening budgets will force customers to implement it."
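For readers unfamiliar with the mechanics behind the primary deduplication the executives describe, the core idea is simple: carve data into blocks, fingerprint each block, and store duplicates only once. The sketch below is purely illustrative, using fixed-size blocks and SHA-256 fingerprints; shipping products such as NetApp's dedupe engine use more sophisticated approaches (variable-size chunking, inline or post-process scheduling) not shown here.

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and keep each unique block once.

    Returns the unique-block store and the ordered 'recipe' of
    fingerprints needed to reconstruct the original data.
    """
    store = {}    # fingerprint -> block contents (unique blocks only)
    recipe = []   # ordered fingerprints to rebuild the original stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        fp = hashlib.sha256(block).hexdigest()
        store.setdefault(fp, block)   # duplicate blocks are stored once
        recipe.append(fp)
    return store, recipe

def rebuild(store, recipe):
    """Reassemble the original data from the store and recipe."""
    return b"".join(store[fp] for fp in recipe)

# Highly redundant data (think VM images): 100 logical blocks, 1 unique
data = b"A" * 4096 * 100
store, recipe = dedupe_blocks(data)
```

The fingerprint lookup on every write is also where the performance overhead HP's Nunes warns about comes from: real systems must keep that index fast without burning memory.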
Solid-state storage: SSD adoption depends on pricing
HP's Nunes said the need for efficiency will bring solid-state drives (SSDs) more into play for storage, but only in small increments. He sees organizations using "a tiny amount of SSD with plenty of SATA. It doesn't take much solid-state to do what people need it to do, with the right software. Customers deploy solid-state for hot data, about 1% or 2% gets moved [with automated tiering]. The real winner is SATA; that will take the lion's share of footprint in the data center."
NetApp's Bercovici said the big push for SSDs will come when the price per gigabyte of SSD gets to approximately two or three times that of spinning disk. "The short-term benefits are a race to the bottom for disk pricing," he said. "Twenty-five percent of all systems now have solid-state shipping with them. Once the price of SSD and disk begin to converge, we'll see some interest in solid-state implementations."
IBM's Wojtowecz said automated storage tiering software such as Dell/Compellent's Data Progression, EMC's FAST, HP 3PAR's Adaptive Optimization and IBM's Easy Tier will spur SSD adoption by allowing customers to "spread data around on SSDs or disk or virtual tape libraries."
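The tiering products named above all rest on the same idea: watch access frequency and promote the hottest slice of data to solid-state. A minimal sketch of that decision, assuming simple per-block access counts and the roughly 1% to 2% hot-data figure Nunes cites (real products such as FAST or Easy Tier work on extents and rebalance on a schedule; the function and names here are hypothetical):

```python
from collections import Counter

def plan_tiering(access_counts: Counter, ssd_fraction: float = 0.02):
    """Pick which blocks belong on the SSD tier.

    Promotes the most frequently accessed ~2% of blocks to
    solid-state and leaves the rest on SATA.
    """
    n_ssd = max(1, int(len(access_counts) * ssd_fraction))
    hot = {blk for blk, _ in access_counts.most_common(n_ssd)}
    cold = [blk for blk in access_counts if blk not in hot]
    return hot, cold

# 1,000 blocks where a couple dominate the I/O workload
counts = Counter({f"blk{i}": 1 for i in range(1000)})
counts["blk0"] = 500
counts["blk1"] = 400
hot, cold = plan_tiering(counts)
```

With 1,000 blocks and a 2% SSD budget, only 20 blocks land on solid-state, which is why Nunes argues SATA still takes "the lion's share of footprint."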
10 GbE pushes iSCSI
Organizations have yet to embrace Fibre Channel over Ethernet (FCoE), and even storage vendors admit FCoE probably won't catch on in large numbers for data storage systems in 2011. However, 10 GbE, which was seen as the driver of FCoE in storage, is instead being adopted for iSCSI SANs.
Emulex's Walsh said 10 GbE is driving demand for iSCSI, but FCoE hasn't really moved beyond the pilot stage for storage. Financial services firm UBS presented at a recent Emulex analyst day, pointing to FCoE as a key technology over the next three to five years. Walsh said he expects to see more FCoE products this year that will fuel FCoE adoption in late 2011-early 2012.
HP's Nunes said he sees a similar adoption pattern for FCoE. "Folks are giving it a fair bit of thought, but it's definitely following behind 10 gig iSCSI in demand," he said. "We're seeing more interest in 10 GigE from an iSCSI perspective than from an FCoE perspective."