Big files create big backup issues

Big files? No problem
Large files are easy to scan, but sending them to tape can take a great deal of time. Many options exist to speed the streaming of a large file to tape, but some applications allow more intelligent and integrated backups. Agents can extract native records from large databases, enabling incremental and online backups. These often have the added benefit of making restored data more useful, since smaller logical pieces can be recovered. For example, a native software agent can extract individual mailboxes or messages, rather than attempting to back up the entire message store every day.
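To make the per-mailbox idea concrete, here is a rough sketch in Python of message-level extraction over IMAP, using only the standard imaplib module. The server name, credentials and output directory are hypothetical placeholders, and a commercial agent would use the mail server's native APIs and track incremental state; the point is simply that messages are fetched individually, so a restore can later pull back a single mailbox or message instead of the whole store.

    # Sketch: per-message extraction over IMAP. Host, credentials and
    # the output directory are hypothetical placeholders.
    import imaplib
    import os

    conn = imaplib.IMAP4_SSL("mail.example.com")
    conn.login("backup-agent", "secret")

    # Enumerate every mailbox (folder) in the account.
    status, mailboxes = conn.list()
    for entry in mailboxes:
        # Assumes the common '"/"' hierarchy delimiter in LIST replies.
        name = entry.decode().rsplit(' "/" ', 1)[-1].strip('"')
        conn.select('"%s"' % name, readonly=True)
        status, data = conn.search(None, "ALL")
        for num in data[0].split():
            # One message per file: restores can recover a single
            # message instead of the entire message store.
            status, msg = conn.fetch(num.decode(), "(RFC822)")
            path = os.path.join("backup", name, num.decode() + ".eml")
            os.makedirs(os.path.dirname(path), exist_ok=True)
            with open(path, "wb") as f:
                f.write(msg[0][1])
    conn.logout()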

Ford Motor Co.'s Woods used Oracle Corp.'s Recovery Manager (RMAN) technology to allow the database to interact directly with the backup system. RMAN enables many advanced features, including multiplexing of backup data streams, encryption, compression, integration with snapshots and the ability to "freeze" database activity for a consistent copy. Woods streams this data to disk for maximum performance, copying it to tape later as needed.
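Woods' exact configuration isn't described here, but a disk-first RMAN backup is typically driven by a short command script fed to the rman client. The sketch below shows the general pattern rather than Ford's setup, and the connection, paths and channel count are assumptions: a compressed level 1 incremental backup written to disk over two parallel channels, which a later job can copy to tape.

    # Sketch: a scheduler script driving an RMAN disk backup.
    # Paths, channel count and OS authentication are assumptions.
    import subprocess

    RMAN_SCRIPT = """
    RUN {
      ALLOCATE CHANNEL d1 DEVICE TYPE DISK FORMAT '/backup/orcl/%U';
      ALLOCATE CHANNEL d2 DEVICE TYPE DISK FORMAT '/backup/orcl/%U';
      BACKUP AS COMPRESSED BACKUPSET
        INCREMENTAL LEVEL 1
        DATABASE PLUS ARCHIVELOG;
    }
    """

    # The rman client reads commands from standard input;
    # "target /" connects to the local database via OS authentication.
    subprocess.run(["rman", "target", "/"],
                   input=RMAN_SCRIPT, text=True, check=True)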

There's no quick fix to the problem of large backups, but there are many effective approaches. If you have a large number of infrequently changed files, consider splitting the backup job to speed the scanning process. If these files need to get to tape, use a snapshot to gain extra time (see "Use snapshots," below). And if a few large files clog up the queue, check to see if there's an agent that can accelerate the process. Big backups need not jeopardize your data protection.
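To sketch the job-splitting idea: the Python fragment below scans each top-level directory of a hypothetical file server in its own process, so millions of rarely changed files don't serialize behind a single tree walk. Commercial backup products split jobs inside their own schedulers; this only shows the shape of the approach, and the root path, cutoff and worker count are illustrative assumptions.

    # Sketch: split one large scan into per-directory jobs run in
    # parallel. Root path, cutoff and worker count are illustrative.
    import os
    import time
    from concurrent.futures import ProcessPoolExecutor

    ROOT = "/export/home"            # hypothetical file server mount
    CUTOFF = time.time() - 86400     # "changed in the last day"

    def scan(subtree):
        """List files under one subtree modified since the cutoff."""
        changed = []
        for dirpath, _, filenames in os.walk(subtree):
            for name in filenames:
                path = os.path.join(dirpath, name)
                if os.path.getmtime(path) > CUTOFF:
                    changed.append(path)
        return changed

    if __name__ == "__main__":
        subtrees = [os.path.join(ROOT, d) for d in os.listdir(ROOT)]
        with ProcessPoolExecutor(max_workers=4) as pool:
            for subtree, batch in zip(subtrees, pool.map(scan, subtrees)):
                print(subtree, len(batch), "changed files")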


Use snapshots
When faced with the challenge of massive backups, not everyone agrees that traditional backup methods are the right approach. "Skip the traditional backup to tape, since these take forever to complete, especially with file systems with millions of files," suggests Edwinder Singh, business manager of the Datacentre Solutions Group at Datacraft Asia in Singapore. "Use a snapshot or clone instead." This approach is gaining favor among users with backup challenges. Snapshots are quick and have little or no impact on clients, as they usually leverage the resources of the storage array. "Snapshots take up less space than clones or mirrors, since they use pointers to the production volume," explains Singh. So a storage system can retain a large number of snapshots for point-in-time reference.
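The pointer mechanism Singh describes is copy-on-write: a snapshot stores only the blocks that change after it is taken, and reads of everything else fall through to the production volume. The toy Python model below, which is not any vendor's actual implementation, shows why a snapshot of a mostly static volume costs almost nothing, and why the unchanged blocks exist only on the primary.

    # Toy copy-on-write snapshot. Illustrative only; real arrays work
    # at the block-device level, not on Python dictionaries.
    class Volume:
        def __init__(self, blocks):
            self.blocks = blocks            # block number -> data

        def write(self, n, data, snapshots):
            for snap in snapshots:
                # Preserve the old block in each snapshot first.
                snap.frozen.setdefault(n, self.blocks[n])
            self.blocks[n] = data

    class Snapshot:
        def __init__(self, volume):
            self.volume = volume            # pointer to production data
            self.frozen = {}                # only blocks changed since

        def read(self, n):
            return self.frozen.get(n, self.volume.blocks.get(n))

    vol = Volume({0: "aaa", 1: "bbb", 2: "ccc"})
    snap = Snapshot(vol)
    vol.write(1, "BBB", [snap])
    print(snap.read(1))        # "bbb": the point-in-time view
    print(len(snap.frozen))    # 1 changed block stored, not a full copy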

But there are some negative aspects to this approach. Snapshots lack the file catalog commonly found in backup applications, and restoring files is a manual process, which makes recovery much more difficult. Singh also points out that "accessing your snapshots will affect the production volumes, as they both refer to the same set of data." This can be mitigated, but "losing your primary data in a disk crash will render your snapshots useless," he says, just when backups are needed most.


This was first published in May 2008
