
Keep your backups within your window

This tip discusses how pruning your data, tailoring your backups to network patterns and checking the logs of network activity can allow you to do more with less and deal with a shrinking backup window.

As data to be protected grows, backup windows shrink -- and the storage administrator is caught in the middle.

In fact, the 'backup window' in the classic sense doesn't exist anymore in most places. There is simply no time when the computer system is not handling jobs for somebody; instead, there are only periods of higher and lower utilization.

Given this situation, the storage administrator's instinct is probably to ask for more resources: faster connections, higher-capacity tape libraries, a new disk-to-disk-to-tape (D2D2T) system or a continuous data protection (CDP) appliance. That may be the right answer, but in today's do-more-with-less climate it's going to take some hard selling.

Prune your data
One of the fastest and easiest ways to cut your backup time is to back up less data. Fortunately, this is pretty straightforward, especially in today's differentiated backup environment. For example, carefully consider whether to back up temp files, especially if you're using a journaling file system or Microsoft's System Restore function. Many backup packages, such as Veritas' Backup Exec, include policy engines that give storage administrators very fine control over which kinds of files are backed up. Taking the time to study your software's policy management features and analyzing what you're storing can pay big dividends in reducing backup times.
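Before excluding anything, it helps to know how much you'd actually save. The following is a minimal shell sketch that totals the space used by temp and cache files under a directory; the file patterns and paths are illustrative assumptions, not recommendations from any particular backup package.

```shell
#!/bin/sh
# Sketch: sum the on-disk size (in KB) of common temp/cache files
# under a directory, to estimate what excluding them would save.
# The name patterns here are examples only.
excludable_size() {
    find "$1" \( -name '*.tmp' -o -name '*.bak' -o -path '*/cache/*' \) \
        -type f -exec du -ck {} + 2>/dev/null |
        awk '/total$/ {s += $1} END {print s + 0}'
}

# Demonstration against a throwaway directory:
demo=$(mktemp -d)
mkdir -p "$demo/cache"
dd if=/dev/zero of="$demo/cache/old.tmp" bs=1024 count=64 2>/dev/null
echo "excludable KB under $demo: $(excludable_size "$demo")"
rm -rf "$demo"
```

Run it against a real data directory and compare the total with your nightly backup size; that ratio tells you whether pruning alone can meaningfully shorten the window.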

Check your logs
This is another time when your system's logs are your friends. By checking the logs of network activity, processor use and other relevant factors, you can find bottlenecks and hot spots that are slowing down your backups. What you're looking for here isn't your network's (or SAN's) usage patterns (see next point). Instead, you're looking for inefficiencies that can be eliminated to make the process run faster.
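If your backup software or monitoring tools log throughput samples, a short script can flag the slow spots for you. This sketch assumes an invented two-column log format (HH:MM timestamp, MB/s throughput); real log formats will differ, so treat it as a pattern rather than a tool.

```shell
#!/bin/sh
# Sketch: report hours whose average throughput falls below a
# threshold -- candidate bottlenecks worth investigating.
# Assumed (invented) log format: "HH:MM MB_per_s" per line.
slow_hours() {
    awk -v t="$2" '{
        split($1, ts, ":")           # ts[1] = hour of the sample
        sum[ts[1]] += $2; n[ts[1]]++
    } END {
        for (h in sum)
            if (sum[h] / n[h] < t) print h
    }' "$1" | sort
}

# Demonstration with sample data:
log=$(mktemp)
cat > "$log" <<'EOF'
01:00 80
01:30 70
02:00 20
02:30 10
EOF
slow_hours "$log" 50    # prints: 02
rm -f "$log"
```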

Tailor your backups to your network patterns
Unless you have a dedicated network for backup, backup is always competing for network bandwidth. The more of the bandwidth the backup can use, the less time it will consume. Examine your network logs to determine your organization's usage patterns on both a weekly and monthly basis and decide how best to fit your backups around competing uses. Don't forget to allow for quarterly, annual or semi-annual events, such as the end of accounting periods, which can require extra resources.
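Once the logs have told you where the quiet periods are, the scheduling itself is usually just cron. These hypothetical crontab entries put incrementals in a weeknight lull and the full backup early Saturday; the times and script names are examples only, not defaults from any product.

```shell
# Hypothetical crontab: incrementals on weeknights, full backup in
# the weekend lull. Adjust times to match your own usage logs.
# m  h  dom mon dow  command
30   1  *   *   1-5  /usr/local/bin/backup-incremental.sh
00   2  *   *   6    /usr/local/bin/backup-full.sh
# Quarter-end and other peak periods need special handling: either
# edit the schedule ahead of time or have the script check a calendar
# before starting a heavy run.
```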

Finally, consider a whole new backup architecture
The problem with all these methods is that ultimately they are self-limiting. They can help, but the total capacity available for backup is fixed, and sometimes you simply need to increase it.

In the modern storage world, there really isn't any such thing as not enough time for backup, provided of course you're willing to pay for it.

With technologies such as D2D2T backup, mirroring and CDP, you can back up almost anything in almost no time at all. The catch is that it's expensive. It usually requires new hardware, a lot of disk space and perhaps a new SAN devoted to backup alone.

Even if a new architecture turns out to be the answer, working through these alternatives first gives you a solid basis of information when you request funding for the new system.

About the author: Rick Cook has been writing about mass storage since the days when the term meant an 80 K floppy disk. The computers he learned on used ferrite cores and magnetic drums. For the last 20 years, he has been a freelance writer specializing in storage and other computer issues.

This was last published in March 2006
