
Deduping slows but doesn’t stop data growth

While most organizations are likely planning to trim their IT budgets next year, many are no doubt also finding that cutting storage capacity will be difficult, if not impossible.

Take Victaulic Company. The pipe-joining manufacturer purchased 30 TB of usable capacity with its new Sepaton S2100-ES2 VTL in September, and infrastructure manager Fred Railing says he’s already ordered 10 TB more because of an increase in data being backed up. And that’s with a 39:1 deduplication ratio from Sepaton’s DeltaStor software.
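As a back-of-the-envelope check, those figures imply a large amount of logical (pre-dedupe) backup data fitting on a modest physical footprint. The capacity and ratio below come from the article; the arithmetic itself is ours and is only illustrative, since real dedupe ratios vary by data type and retention.

```python
# Rough sketch: logical backup data that fits in deduplicated storage.
# Figures from the article (30 TB purchased + 10 TB ordered, 39:1 ratio);
# actual achievable ratios depend on the data being backed up.
usable_tb = 30 + 10        # original purchase plus the follow-on order
dedupe_ratio = 39          # DeltaStor ratio reported by Victaulic

logical_tb = usable_tb * dedupe_ratio
print(f"{usable_tb} TB physical at {dedupe_ratio}:1 holds ~{logical_tb} TB logical")
# → 40 TB physical at 39:1 holds ~1560 TB logical
```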

“[The VTL] was sized appropriately when we bought it, but the amount of data we have to back up has increased 30 percent already,” Railing says. “We keep six weeks’ worth of backup, and we’re close to capacity right now. We have to keep an eye on that to make sure it doesn’t fill up.”

Railing says the increase in data stored hasn’t come from an acquisition or any unusual situation, and he doesn’t see it slowing down much soon.

“We have quite a few projects going on now, and we’ve added a lot more servers in the last nine months,” he said. “It will always increase, maybe not as dramatically as it has been lately, but our engineering and email data keeps growing and giving us more to back up.”

Victaulic dedupes everything it backs up, Railing says, although reducing data wasn’t the original reason for going to the VTL. He set out to reduce backup windows by using disk, and his backups have gone from 24 hours to 12.

“At first we were looking at just getting a VTL to shrink our backup windows,” he said. “We thought it was worth getting the dedupe option because we would end up buying so much more disk. Now we’re deduping everything we back up.”

Join the conversation


Hi Dave – I agree that cutting storage capacity is critical, but it is not impossible, and it’s only difficult if you don’t know what you are looking for. Dedupe is an excellent tool and must be used to accomplish this goal (IMHO), but dedupe only goes so far. There can be much more wasted capacity reclaimed in other areas. For example: over-allocated database tables and apps; orphaned data and space that file systems don’t pick up or see; data that is inert or inappropriate and doesn’t need to be sitting on primary storage in the first place; and space that is allocated for whatever reason but where no data or files are actually stored.

We have personally been able to reclaim anywhere between 30% and 50% of space on primary storage systems by fixing allocation problems, cleaning up inappropriate data, moving inert data to lower storage tiers, AND then deduping duplicate files. It has to be a comprehensive, holistic approach, as opposed to buying a single piece of software and/or hardware. And the solution has to include “people processes” in addition to technology approaches.

But it can be done – and if done right, you can extend the life of your current infrastructure for years to come. Some may want users to think it’s impossible (without their appliance, of course!) – but it’s not…
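The commenter’s point is that reclamation steps compound. The sketch below makes that concrete on a hypothetical 100 TB pool; every percentage in it is invented for illustration, not a measured figure, and the step names simply paraphrase the comment.

```python
# Illustrative only: how a multi-step reclamation effort might compound.
# The pool size and all per-step percentages are assumptions, chosen so
# the total lands in the 30%-50% range the commenter reports.
capacity_tb = 100.0
steps = [
    ("fix over-allocated tables/apps", 0.15),
    ("clean up orphaned/inappropriate data", 0.10),
    ("tier inert data off primary storage", 0.12),
    ("dedupe duplicate files", 0.08),
]

remaining = capacity_tb
for name, fraction in steps:
    freed = remaining * fraction          # each step works on what is left
    remaining -= freed
    print(f"{name}: freed {freed:.1f} TB, {remaining:.1f} TB still in use")

reclaimed_pct = 100 * (1 - remaining / capacity_tb)
print(f"total reclaimed: {reclaimed_pct:.0f}%")
```

Because each step shrinks the pool the next step operates on, the total reclaimed is a bit less than the sum of the individual percentages, which is why a single tool rarely delivers the full figure on its own.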