Whether they are for e-discovery, compliance or storage management, data archiving projects are high priorities for storage managers these days. For most storage managers, the process of picking an archiving product begins with two questions: whether to outsource, and how best to manage the power and cooling requirements of large archives.
SearchStorage.com spoke with archiving customers this month about these issues and how they arrived at their archive purchasing decisions.
On-premise vs. hosted archives
"I'm a one-man show for IT," said Tony Lux, manager of technology for the Boulevard Brewing Company in Kansas City, Mo. Lux began archiving about 30 GB of email using MessageOne's Storage as a Service (SaaS) in 2006 to free up space on his production Exchange server. Now that the Federal Rules of Civil Procedure include a safe harbor for anyone with a cohesive data retention policy, outsourcing has another benefit.
But some storage managers find it easier to roll out their own archives, rather than use a managed service. "We looked at hosted services," said Rick Chin, senior vice president of information technology for privately held Pinnacle Financial Corp. Chin manages 7.5 TB of total storage for 1,600 employees at 100 branch offices in several time zones. He briefly evaluated services from Autonomy-Zantaz but chose to deploy in-house archiving software from Mimosa Systems Inc.
Chin already had his storage centralized at a Florida data center with a secondary disaster recovery site for storing replicated archive data, and Pinnacle had just added a new NetApp FAS3020 filer with disk space to support an on-premise archive. "With all of that already in place," he said, "a one-time acquisition cost for Mimosa made far more sense than an ongoing monthly subscription to a service."
Other storage managers are wary of trusting data to a third party, regardless of how much storage they manage. "I don't know you, I don't trust you, I don't know how long you'll be in business," was Carl Purdy's message to hosted archiving providers when the Jefferson County, Mont., IT administrator was evaluating products for his 50 GB environment. "You could pull the plug on me tomorrow, and my only recourse would be to move on." Jefferson County is currently running Forensic & Compliance Systems' Cryoserver email archiving appliance.
Data center power, cooling and capacity decisions
Storage managers who choose on-site archives, especially inside large organizations, will have to contend with supporting large long-term data stores at a time when many data centers face power and cooling crunches. The need to deal with power issues increases tape's value as an archiving medium, and new software tools can also help save energy and space.
"We have 750 TB of capacity in our archive," said Jason Williamson, senior vice president of systems development for Elektrofilm, which has about 200 TB of archival data written to Sun's SL3000 tape library. According to Williamson, the amount of data in his environment makes storing it all on disk impractical. "The cost of tape media is about 30 cents per GB, while with a SAN, the total cost can add up to tens of dollars a gigabyte," he said. Sun's SAM-FS software helps Williamson improve data access time on tape by keeping track of the data on each cartridge and automatically retrieving it from the tape library when it's needed.
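The scale of that cost gap is easy to illustrate with the figures Williamson cites. A rough media-cost comparison (a sketch only; the $10/GB SAN figure is an illustrative stand-in for "tens of dollars a gigabyte," not vendor pricing):

```python
# Rough media-cost comparison for ~200 TB of archival data,
# using the per-GB figures quoted in the article.
archive_gb = 200_000          # ~200 TB written to the tape library
tape_cost_per_gb = 0.30       # quoted tape media cost, about 30 cents per GB
san_cost_per_gb = 10.00       # illustrative low end of "tens of dollars a gigabyte"

tape_total = archive_gb * tape_cost_per_gb
san_total = archive_gb * san_cost_per_gb
print(f"Tape media: ${tape_total:,.0f}")   # → $60,000
print(f"SAN disk:   ${san_total:,.0f}")    # → $2,000,000
```

Even at the low end of the quoted SAN range, disk comes out more than 30 times as expensive per gigabyte, which is why tape remains attractive at this scale despite slower access.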
Sun Microsystems Inc. recently packaged products similar to those used by Williamson into a bundle called the CIS Infinite Store Archive System. Index Engines Inc.'s Automated Tape Extraction Module, which became available in February, is another tool to classify and retrieve data on tapes for e-discovery purposes.
Other storage managers said data deduplication products that integrate with their archiving software are the best way to lower costs while archiving large amounts of data on disk.
Kevin Fitzpatrick, IT director for Roel Construction, said he first deployed Data Domain's DD460 data deduplication device for disk-based backup. But he now stores archival project data on a partition of the box. Because Data Domain offers a standard NAS interface, Fitzpatrick said administrators just drag and drop data folders onto the archive partition when necessary. So far, the company has 1.5 TB of logical data on about 300 GB of physical disk and expects archival data to build up steadily.
"Right now, we're using around 300 GB to store 1.5 TB, so that's a quarter of what we'd normally spend on capacity," Fitzpatrick said. "We expect we'll have about 15 TB of data after three years, and if we can do it at a quarter of the price, we can ditch tape."
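Fitzpatrick's figures can be sanity-checked with quick back-of-the-envelope math (a sketch; the assumption that the current reduction ratio will hold at 15 TB is his, and dedup ratios do vary with data type and change rate):

```python
# Deduplication math using the figures quoted above.
logical_gb = 1_500   # 1.5 TB of archived project data
physical_gb = 300    # actual disk consumed on the dedup appliance

ratio = logical_gb / physical_gb
print(f"Reduction ratio: {ratio:.0f}:1")  # → 5:1

# Projecting the quoted three-year estimate of 15 TB of logical data,
# assuming the same reduction ratio holds:
projected_logical_gb = 15_000
projected_physical_gb = projected_logical_gb / ratio
print(f"Projected physical disk needed: {projected_physical_gb:,.0f} GB")  # → 3,000 GB
```

Note the quoted figures actually work out to a 5:1 reduction, slightly better than the "quarter" Fitzpatrick describes; at that rate, 15 TB of archival data would fit on about 3 TB of physical disk.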