This article can also be found in the Premium Editorial Download "Storage magazine: What you need to know about all solid-state arrays."


While there are scenarios where deduplication should be driven by the storage (1.0) or the media server (1.5), for most environments the ideal arrangement has changed data going directly from the production server to storage (2.0), with the backup server acting only as the scheduler and keeper of the catalog and metadata. This level of deduplication maturity doesn't necessarily require dedicated hardware, though it does require software.
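The 2.0 model can be sketched in a few lines: the production client fingerprints its own chunks and ships only those the target has not already seen, while the catalog (an ordered recipe of chunk hashes) is all the backup server needs to retain. A minimal illustration, assuming fixed-size chunking and an in-memory chunk index (both hypothetical simplifications -- real products use variable-size chunking, compression and persistent indexes):

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking for illustration only


def dedupe_backup(data: bytes, store: dict) -> tuple[list, int]:
    """Split data into chunks and transfer only chunks the store lacks.

    Returns the recipe (ordered list of chunk hashes) and the number of
    bytes actually sent. `store` stands in for the target's chunk index.
    """
    recipe, sent = [], 0
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:        # new chunk: transfer and index it
            store[digest] = chunk
            sent += len(chunk)
        recipe.append(digest)          # catalog entry either way
    return recipe, sent


def restore(recipe: list, store: dict) -> bytes:
    """Rebuild the original stream from the recipe and the chunk store."""
    return b"".join(store[d] for d in recipe)
```

A second backup of unchanged data transfers zero bytes, because every chunk hash already exists in the index -- which is exactly why the 2.0 pattern is attractive over thin network links.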

ESG’s latest research on data protection modernization reveals how dedupe users are delivering or planning to deliver deduplication within their data protection products. Among current dedupe users, 46% use a software-only approach, 28% apply hardware-only methods and 26% employ a combination of the two.

• A software-only approach might involve Symantec NetBackup 7.5, for example, with its client-side deduplication and accelerator features. Symantec Backup Exec, CommVault Simpana and Quest (now part of Dell) NetVault represent similar software-centric approaches.

• A hardware-only approach might involve any deduplication array in which enablement software is turned off (or not available) within the backup server, and the backup server is unaware of the deduplication capabilities of the storage.

• A hardware plus software approach might be something along the lines of the EMC Data Domain products, with Data Domain Boost at work either in the backup server or in the production server via EMC NetWorker or even Oracle Recovery Manager (RMAN). Similar functionality is being touted by Hewlett-Packard (HP) through its recently announced HP StoreOnce and Catalyst enablement APIs.

Interestingly, IT respondents who aren’t yet using deduplication (but plan to deploy) have a different strategy. In those cases, only 19% plan to use a software-only approach (down from 46% of current users). These respondents have a much higher anticipated use of hardware-centric or hardware plus software products.

If you haven’t committed to data deduplication, do that first. Next, consider where the deduplication will occur. When talking to vendors, get them to pinpoint their published ingest rate for data during backups, as well as their restore rate (which may be very different).
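One way to sanity-check a vendor's published numbers is to time your own backup and restore jobs against a known data size. A trivial, hypothetical harness (the `fn` callable stands in for whatever kicks off the job in your environment):

```python
import time


def throughput_mb_s(fn, nbytes: int) -> float:
    """Run a backup or restore callable and report its rate in MB/s."""
    start = time.perf_counter()
    fn()  # e.g. a wrapper around your backup tool's CLI or API
    elapsed = time.perf_counter() - start
    return nbytes / elapsed / 1_000_000
```

Run it once for ingest and once for restore of the same data set; if the two numbers diverge sharply, that gap belongs in your vendor conversation.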

Then, because every vendor calls itself the “most innovative next-generation leader in deduplication,” test it -- not with a read-through of their whitepapers, but with a representative sampling of your data.
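A representative test can start with something as simple as measuring how much of your sample data is redundant at a given chunk size. A rough, hypothetical first-pass estimator using fixed-size chunks and SHA-256 fingerprints (shipping products use variable-size chunking and compression, so real ratios will differ):

```python
import hashlib


def dedupe_ratio(path: str, chunk_size: int = 4096) -> float:
    """Estimate a dedupe ratio for one file: total bytes / unique bytes."""
    seen = set()
    total = unique = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            total += len(chunk)
            digest = hashlib.sha256(chunk).digest()
            if digest not in seen:     # first time we see this chunk
                seen.add(digest)
                unique += len(chunk)
    return total / unique if unique else 1.0
```

Running this over a sampling of your real backup sources gives you a baseline to hold up against whatever ratio the vendor's whitepaper promises.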

BIO: Jason Buffington is a senior analyst with Enterprise Strategy Group. He focuses primarily on data protection, Windows Server infrastructure, management and virtualization. He blogs at CentralizedBackup.com and tweets as @JBuff.

This was first published in August 2012
