Instead of buying an enterprise-grade storage-area network (SAN), the Sundance Institute chose a StorSimple public cloud storage gateway to help keep up with rapid data growth generated during an 11-day period every year.
Sundance turned to a StorSimple iSCSI cloud appliance in mid-2011 instead of adding a separate SAN for primary storage, archiving and backup. The organization’s storage is mostly high-definition video files and photos, and its capacity increases as much as 30% every year during the 11-day Sundance Film Festival.
Justin Simmons, associate director of technology services, said the Sundance Institute is bracing for a significant boost in data when the festival opens Jan. 19. He said the non-profit has approximately 25 TB of video files, 5 TB of photos, and 2 TB of Microsoft SharePoint files and documents. He expects that to grow by another 10 TB during the upcoming Sundance Film Festival and hit a total of 60 TB within the next year.
Last April, not long after the 2011 festival, the Sundance IT team considered purchasing a new SAN to keep up with data growth. They also considered moving their entire compute and storage operation to the cloud.
“We had rapidly expanding storage, and a lot of this was an expansion of unstructured data around video and photos,” Simmons said. “We started collecting high-definition video all over the place from people taping panels and interviews during the festival.”
He said the most immediate problem was backup. Sundance was backing up to disk with Symantec Backup Exec, but couldn’t complete a daily backup within 24 hours. The organization’s storage administrator was constantly juggling backups between its direct-attached storage (DAS) and Hewlett-Packard LeftHand iSCSI SANs.
“Backup was fine for day-to-day working documents, but it was hard to back up video,” Simmons said. “We were playing a storage shell game, moving backups around to fit everything in.”
Sundance had approximately 5.5 TB of storage on three HP LeftHand boxes with high-performance 15,000 rpm drives. It still uses those systems to store data from Citrix XenServer, Microsoft Cluster Server and VMware applications, but Simmons said it would have cost too much to use that system for all of its storage. He estimates that, with RAID, he would need 150 TB on the LeftHand boxes for his video storage within the next year.
“We needed something that would scale,” he said. “I anticipated that we’d be close to doubling the size of our SAN every year. The cloud started looking like a good way to go. We looked at Amazon to see how we could get data in and out of there. We would have to write to an API to use the files and pull them down through cloud storage providers. Then I heard about storage gateways that would let me put files in a public cloud that are encrypted and deduplicated, but look like private storage to our servers.”
Simmons said StorSimple rival Nasuni was “an interesting option” because it offered software as a virtual appliance. But Nasuni is network-attached storage (NAS)-only and, at that point in time, didn’t sell its software installed on an appliance (Nasuni now sells both appliance and software-only options). The software-only setup required using on-premises disk for cache, and he didn’t have any disk to use. Simmons also wanted block storage capabilities and the performance from solid-state drives (SSDs) in StorSimple appliances.
After testing a StorSimple appliance last summer, Sundance purchased a gateway with 14 TB of capacity. StorSimple provides the Sundance Institute with three tiers. Simmons and his team use SSDs for the highest performance, 2 TB SAS drives for high capacity, and the cloud for archiving and backup. “We have day-to-day data on solid-state drives,” Simmons said. “Those are the files that people are actively working on.
“StorSimple allows me to come up with a hybrid model,” he added. “I can do compute in my own data center, but store and archive in the cloud.”
Simmons said the cloud solved his backup problem. He uses StorSimple’s Cloud Snapshots to copy data blocks and send them off encrypted to the cloud. The Sundance Institute still uses Backup Exec for SharePoint, SQL and Exchange data to take advantage of its application-specific agents, but is moving the high-capacity files to the cloud.
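The block-level snapshot model Simmons describes can be illustrated with a minimal sketch: data is carved into fixed-size blocks, and only blocks the cloud store has not already seen are uploaded (deduplication by content hash). StorSimple also encrypts blocks before they leave the appliance; that step is noted but not implemented here, and all names below are illustrative, not StorSimple’s actual API.

```python
import hashlib

BLOCK_SIZE = 4  # tiny blocks for illustration; real gateways use far larger ones


def snapshot(data: bytes, cloud: dict) -> list:
    """Split data into fixed-size blocks and upload only the blocks the
    cloud has not stored before. Returns the ordered list of block
    hashes (the manifest) that makes up this snapshot."""
    manifest = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in cloud:       # new or changed block -> upload it
            cloud[digest] = block     # (encrypted before upload in practice)
        manifest.append(digest)
    return manifest


cloud = {}                                 # stands in for the cloud object store
snap1 = snapshot(b"AAAABBBBCCCC", cloud)   # first snapshot uploads 3 blocks
snap2 = snapshot(b"AAAABBBBDDDD", cloud)   # only the changed block is uploaded
print(len(cloud))                          # 4 unique blocks stored in total
```

Because unchanged blocks are never re-sent, the second snapshot costs one block of upload rather than three, which is what makes daily off-site copies of large video volumes practical.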
Simmons said Sundance uses Microsoft Windows Azure as its cloud provider, but he’s keeping his options open. He may use different providers for archive and day-to-day files, depending on price and service levels.
“I want to shop around,” he said. “I have the ability to use the best provider at the best price.”
Tiering to the cloud
The Sundance Institute is just starting to take advantage of the cloud tier. Simmons said he hasn’t started uploading video files yet, but the upcoming festival will require more capacity than he has on the appliances from StorSimple. “As we reach our [on-premises] capacity, we’ll start tiering to the cloud,” he said. “Most of that will go to the cloud provider, and we’ll snap it so we have two copies. We’ll have no copies of video files on local storage.”
Sundance will use StorSimple’s policies to automatically move older data to the cloud. “It will take some training for our video team,” Simmons explained. “When you go to access a file and it’s on the cloud, it will take a while. But it’s rare that they will have to access an uncompressed raw version of a video file.”
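An age-based tiering policy of the kind described above can be sketched in a few lines. The thresholds here are hypothetical, chosen only to mirror the three tiers Sundance uses; StorSimple’s actual policy engine and its defaults are not documented in this article.

```python
SAS_AGE_DAYS = 1     # hypothetical: data idle longer than this leaves SSD
CLOUD_AGE_DAYS = 30  # hypothetical: data idle longer than this goes to cloud


def tier_for(idle_days: float) -> str:
    """Map how long a file has sat untouched to one of the three tiers."""
    if idle_days < SAS_AGE_DAYS:
        return "ssd"    # actively edited files stay on solid-state
    if idle_days < CLOUD_AGE_DAYS:
        return "sas"    # cooler data sits on high-capacity SAS disk
    return "cloud"      # cold data, e.g. raw festival video, is tiered out


print(tier_for(0.5), tier_for(7), tier_for(90))  # ssd sas cloud
```

The point of automating the rule is exactly the training issue Simmons raises: users never choose a tier, so the only visible difference is the extra latency when a cloud-resident file is opened.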
He said restores take a little longer from the cloud, mainly because there are more steps to the process. “We’re getting people trained to take a Cloud Snapshot, mount it to a server and pull the file off it,” Simmons said.
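The extra steps Simmons mentions, fetching a snapshot’s blocks back from the cloud before the file can be reassembled, can be sketched as follows. A small in-memory dict stands in for the cloud object store, and the names are illustrative rather than StorSimple’s.

```python
def restore(manifest: list, cloud: dict) -> bytes:
    """Rebuild a file from a cloud snapshot: fetch each block listed in
    the snapshot's manifest and concatenate them in order. Each fetch is
    a round trip to the cloud, which is why a cloud restore runs slower
    than pulling the same file from local disk."""
    return b"".join(cloud[digest] for digest in manifest)


# Stand-in cloud store holding two blocks keyed by content hash.
cloud = {"h1": b"hello ", "h2": b"world"}
manifest = ["h1", "h2"]            # ordered block list for one file
print(restore(manifest, cloud))    # b'hello world'
```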
He said his team recently restored a large PST file in minutes, roughly five times as long as the same restore from local storage would have taken, “but it’s worth it because of all the time we were spending on backups.”
Simmons said he expects to eventually add another StorSimple gateway to the Sundance Institute’s Los Angeles office. For now, the organization’s Los Angeles staff has to access data over the wide-area network (WAN) from the gateway in the Salt Lake City data center.