Is traditional backup past its prime?

We’ve been backing up our data the same way for decades, but proliferating applications and massive amounts of data are forcing a change.

For a long time now, people have been predicting the death of backup, but with Rasputin-like tenacity, it just won’t die. Stab it, poison it, shoot it and then drown it -- it’s still a dominant and pervasive technology throughout the known universe. And while some may want to believe that backup has one foot in the grave and the other on a banana peel, the rumors of the death of backup have been greatly exaggerated. All the ingredients to kill it off exist right now, but IT tends to move at a leisurely pace, so it will take years for those elements to become mainstream.

Backup is still one of the areas of IT that’s the least innovative, causes some of the biggest issues, and costs users time and money. The most exciting thing to happen to backup in a long time was Data Domain, which attacked the universal challenges created by using tape as a backup medium. Tape is unreliable and hard to manage; recovery processes can be complex, inefficient, cumbersome and prone to error . . . the list goes on. But the one thing tape has going for it is that it’s cheap. Who doesn’t want an inexpensive insurance policy? It’s only when something goes wrong that you care about how good your insurance policy is. And if that hardly ever (or never) happens, life is good.

Data Domain changed the economics of backing up to disk with dedupe, and the rest is history. What it ultimately provides is much better “insurance” that’s economically compelling. It’s a no-brainer.
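
Under the hood, disk-based dedupe boils down to content-addressed chunk storage: identical chunks are stored once, so repeated full backups cost little incremental disk. The sketch below is a minimal illustration of that idea, not Data Domain’s actual design -- real systems use variable-length segmenting and on-disk indexes rather than the fixed 4 KB chunks and in-memory dictionary assumed here.

    import hashlib

    def dedupe_backup(data: bytes, chunk_store: dict, chunk_size: int = 4096):
        """Split a backup stream into chunks and store each unique chunk once."""
        manifest = []      # ordered chunk hashes; enough to reassemble the stream
        new_bytes = 0
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in chunk_store:   # only never-before-seen chunks hit disk
                chunk_store[digest] = chunk
                new_bytes += len(chunk)
            manifest.append(digest)
        return manifest, new_bytes

    store = {}
    monday = b"A" * 8192 + b"B" * 8192              # Monday's full backup
    _, written_mon = dedupe_backup(monday, store)   # 8192: A and B chunks stored once
    tuesday = b"A" * 8192 + b"C" * 8192             # Tuesday's full: half unchanged
    _, written_tue = dedupe_backup(tuesday, store)  # 4096: only the new C chunk
    print(written_mon, written_tue)

The second full backup consumes half the disk of the first because its unchanged chunks are already in the store -- that, at scale, is the economic shift.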

But whether or not anything exciting is going on with backup, you still have to back stuff up. You still require software and agents, as well as server, network and storage resources. And while data continues to grow as databases get bigger and file systems become more massive, you still need to have small backup data sets because who wants to recover a 10 TB, 100 TB, 500 TB or 1 PB data set? Just guessing, but I’d say nobody does.
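
To put rough numbers on that, here is a back-of-envelope restore-time calculation. It assumes a sustained 10 Gb/s pipe and ignores seeks, verification and catalog overhead, so real restores would run slower:

    # Back-of-envelope restore times over a sustained 10 Gb/s link (~1.25 GB/s),
    # ignoring seeks, verification and catalog overhead.
    GB_PER_SEC = 10 / 8                    # 10 gigabits/s = 1.25 gigabytes/s
    for size_tb in (10, 100, 500, 1000):   # 1000 TB = 1 PB
        hours = size_tb * 1000 / GB_PER_SEC / 3600
        print(f"{size_tb:>5} TB -> {hours:6.1f} hours ({hours / 24:.1f} days)")

Even under those generous assumptions, a 100 TB restore takes nearly a full day and a 1 PB restore more than nine days of raw transfer time alone.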

Data storage vendors love to talk about limitless file systems and massive object-based storage systems, but how do you back up that kind of stuff? If you have a large file system, you’re most likely not backing it up but replicating it. However, what replication lacks is the ability to find and recover data at a granular level. If you replicate at the storage system level, you’re using block-based technology. If you use host-based software, you eat up so much of the server’s resources that it becomes impractical, even impossible, to replicate large amounts of data with that method.
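
The granularity gap is easy to see side by side. The structures below are hypothetical and purely illustrative, not any product’s format: a block-level replica indexes device offsets, while a traditional backup catalog indexes files.

    # Hypothetical data structures for illustration only.
    block_replica = {          # device offset -> raw bytes; no filenames anywhere
        0:      b"\x00" * 512,                                # superblock
        8192:   b"<inode table>",
        819200: b"<invoice.pdf lives somewhere in these extents>",
    }

    backup_catalog = {         # what traditional backup software maintains
        "/finance/invoice.pdf": {"job": "nightly", "media": "vol-0042"},
    }

    # Granular restore from a catalog is a direct lookup:
    print(backup_catalog["/finance/invoice.pdf"])

    # The block replica has no such index: to recover one file you must
    # mount the replica and walk the filesystem yourself.

That missing file-level index is exactly why replication alone doesn’t replace backup’s single-file restore.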

Users with smaller environments can continue to back up using the same traditional processes, but what happens when 100 TB increases to 200 TB? When 500 TB turns into 1 PB? I’m working with companies that have petabytes of data in their environments, and the rate of growth is staggering. The unbridled growth of storage will soon require another approach because legacy backup and recovery software won’t cut it anymore.

The vast majority of users don’t have massive file systems or gargantuan databases; instead, they have lots and lots of smaller instances of each. There are often hundreds or even thousands of discrete backup jobs, creating operational challenges, multiplying the risk of data loss and racking up huge costs.

Server virtualization is also pushing the limits of backup. Even users with smaller environments may have dozens of applications they have to manage. With virtualization, it’s not just the amount of data, but the number of apps you have to protect.

Overall, backup is still the de facto method of protecting data and will be for at least the next decade. Having said that, I think the beginning of the end is upon us. Backup processes are being replaced incrementally; replication and recovery will be used more often, as will application-based data protection tools. These two points may seem contradictory, but it’s often the way within IT that the status quo and the next-generation approach coexist for long periods of time.

BIO: Tony Asaro is senior analyst and founder of Voices of IT.

This was first published in July 2011
