Companies scrambling to get their data off storage clouds hosted by failed provider Nirvanix Inc. face varying challenges, depending on the type of setup they used.
Nirvanix has extended its deadline for customers to remove data from its cloud to Oct. 15. Updates on the company's website disclose that it has filed for Chapter 11 bankruptcy and will use its remaining resources to help return customer data or transfer it to other clouds, such as Amazon Simple Storage Service (S3), Google Cloud Storage, IBM SoftLayer or Microsoft Windows Azure.
Nirvanix is directing customers to contact IBM for help. IBM had an OEM agreement to resell Nirvanix storage cloud services.
Nirvanix storage customers face different challenges, depending on what type of controller they used as a transport for data into the cloud; whether they used a private, public or hybrid cloud; and their use case.
Companies that used the Nirvanix public cloud for archiving face the greatest challenge, especially if the data falls under compliance rules. Cloud storage backup customers can redirect their backups to another provider's repository and let the older backups age out. Customers who used the service for tertiary data can simply walk away, since they have primary and secondary copies elsewhere.
"The worst-case scenario is if Nirvanix held one copy, which means you have to move all that data. Those in that situation are pretty much screwed," said Henry Baltazar, a senior analyst for infrastructure and operations professionals at Cambridge, Mass.-based Forrester Research. "You're better off if you followed best enterprise practices and you didn't have all your eggs in one basket."
On the other hand, Baltazar said, "I talked with one customer who said the transition was not hard. They just started writing backups to another cloud or target. There's a different level of importance for each application, so customers are prioritizing."
Nirvanix also offered customers a cloud file storage node called an hNode that resided within a customer's data centers and was used as an on-ramp to the Nirvanix public cloud in a hybrid setup. Customers could federate multiple hNodes in geographically dispersed data centers, with a global namespace and policy-based replication between them.
Baltazar said customers who implemented hNode only in a private cloud should be in good shape, but they could face problems if they used it with the public cloud.
Nirvanix private cloud customers are in the best position because they have the data on Nirvanix equipment in their own data centers.
"Nirvanix is talking directly to private cloud customers. They can just take Nirvanix systems with them," said Steve Ampleford, CEO of U.K.-based cloud provider Aorta Cloud, which is offering services to customers to relocate data from the Nirvanix storage cloud.
Customers who wrote applications against Nirvanix's proprietary application programming interface (API) must rewrite that code. Ampleford said the data can be retrieved from the Nirvanix systems, but customers will have to rewrite their API calls so that put, get and delete operations work against another cloud.
"If you wrote to Nirvanix's APIs, that fundamentally will not work in a new cloud," he said.
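Ampleford's point can be illustrated with a thin abstraction layer: if application code calls put, get and delete through a provider-neutral interface rather than a vendor SDK directly, switching clouds means implementing one new backend class instead of rewriting every call site. The sketch below is illustrative only; the class and method names are assumptions, not Nirvanix's or any other provider's actual API.

```python
# Hypothetical provider-neutral object storage interface.
# Swapping clouds means writing a new backend, not rewriting
# application code that calls put/get/delete.
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...
    @abstractmethod
    def delete(self, key: str) -> None: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend; a real one would wrap a cloud SDK."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]
    def delete(self, key):
        del self._objects[key]

def archive_report(store: ObjectStore, name: str, body: bytes):
    # Application code depends only on the interface.
    store.put(f"reports/{name}", body)

store = InMemoryStore()
archive_report(store, "q3.pdf", b"%PDF-1.4 ...")
```

Code written this way survives a provider failure; code written directly against one vendor's calls, as Ampleford notes, does not.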
Ampleford said that most of the Nirvanix customers who have reached out to Aorta for assistance used Panzura and TwinStrata controllers, and Symantec backup software to transport and store data in the cloud. Not all customers who used Nirvanix cloud storage for backup are home free, he said.
Backup applications such as Symantec NetBackup weren't written to pull all the backups at once. The software is built to restore from disk or tape, not to move backups from cloud to cloud.
"I have to believe there are a set of customers who aren't willing to let their old backups go," said Connor Fee, director of marketing for Nasuni, a cloud network-attached storage controller vendor that uses Amazon S3 and Microsoft Azure as providers. "If they're willing to let it die, they weren't storing important data. The backup software was designed to restore one backup at a time, not move many backups from one storage location to another."
Controller vendors are playing a crucial role in this cloud crisis, particularly Panzura and TwinStrata. Panzura provides a high-performance system, the Quicksilver Cloud Storage Controller, in a virtual or physical form factor. The TwinStrata CloudArray controller also lets customers store data locally and send it off to service providers in a hybrid cloud storage setup. Both systems provide local caching.
TwinStrata CEO Nicos Vekiarides said most of his customers have data "ranging from 1 terabyte to over 100 TB" in the public cloud. None had a petabyte (PB) or more in the public cloud.
"The urgency has not been as high for private cloud customers," Vekiarides said. "Customers that have a local copy on our system can reconnect the copy to a new cloud provider. That's a simple operation. [For] those who have large amounts of data not in cache, we can do a cloud-to-cloud migration. We can shut down the gateway on the customer's premises, set one up in another cloud provider and start migrating to another cloud."
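At its core, the cloud-to-cloud migration Vekiarides describes amounts to enumerating objects in the source cloud and copying each one to the destination. The loop below is a minimal sketch under assumed dict-backed stand-ins for the two clouds; a real gateway would stream objects, handle retries and checkpoint progress, and none of this reflects TwinStrata's actual implementation.

```python
# Sketch of a cloud-to-cloud migration loop. The source and
# destination are plain dicts standing in for object stores;
# a real migration would stream data and checkpoint progress.
def migrate(source: dict, destination: dict) -> int:
    moved = 0
    for key, data in source.items():
        destination[key] = data  # copy object to the new provider
        moved += 1
    return moved

old_cloud = {"vm-backup-001": b"...", "vm-backup-002": b"..."}
new_cloud = {}
migrate(old_cloud, new_cloud)
```

For data already in the local cache, no such copy loop is needed; the gateway simply starts writing to the new provider, which is why Vekiarides calls that case a simple operation.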
Panzura CEO Randy Chou said his company has "more than 10 and less than 100" customers using the Quicksilver controller for the public cloud.
"Every one of our customers is in a good situation," Chou said. "Every one of our customers that has a petabyte or more is in a private cloud, including Cerner and University of Southern California [USC]. Most of our customers are Fortune 500, and in general they have a lot of bandwidth. Plus, one Panzura appliance can pull data in parallel. Multiple copies can be pulled at the same time."
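Chou's point about pulling data in parallel can be sketched with a thread pool that downloads many objects concurrently instead of one at a time. The `fetch` function here is a placeholder, not a Panzura API; the pattern simply shows why parallel pulls shorten a bulk evacuation when bandwidth allows.

```python
# Sketch of pulling many objects in parallel with a thread pool.
# fetch() is a stand-in for a real per-object download call.
from concurrent.futures import ThreadPoolExecutor

def fetch(key: str) -> bytes:
    return f"data-for-{key}".encode()  # placeholder download

def pull_all(keys, workers=8):
    # Download up to `workers` objects concurrently.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(keys, pool.map(fetch, keys)))

objects = pull_all([f"obj-{i}" for i in range(4)])
```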
USC contracted with Nirvanix in 2011 to place 8.5 PB of digital archives in a Nirvanix-hosted private cloud on campus. Sam Gustman, associate dean at USC Libraries and executive director of the USC Digital Repository, declined to discuss what he's doing to cope with the Nirvanix shutdown.
Aorta Cloud's Ampleford said Panzura customers face a big challenge because they're often enterprise-level companies with petabytes of data in the cloud. The Panzura customers that have come to Aorta for assistance, he said, have data in the public cloud.
"It's the data structure," he said. "They embed metadata in the path of the file, so consequently, you have a more complicated structure. We point Panzura to the Aorta Cloud. As part of the upload in the Aorta Cloud, we need to change the data structure so the Panzura device can recognize data in the new target."
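Ampleford's description implies a key-rewriting step: because metadata is embedded in the object path, migrating means transforming each key so the controller recognizes the layout in the new target. The transform below is purely illustrative; the path prefixes shown are invented for the example and do not reflect Panzura's or Aorta's actual data structure.

```python
# Illustrative key transform for a migration where metadata is
# embedded in the object path. The prefixes here are invented
# for the example, not any vendor's real layout.
def rewrite_key(key: str, old_prefix: str, new_prefix: str) -> str:
    if not key.startswith(old_prefix):
        raise ValueError(f"unexpected key layout: {key}")
    return new_prefix + key[len(old_prefix):]

new_key = rewrite_key("nvx/meta/v2/projects/demo.dat",
                      "nvx/meta/v2/", "aorta/meta/")
# new_key is "aorta/meta/projects/demo.dat"
```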