The traditional role of data replication has been to support disaster recovery plans, but that may be about to change.
In recent conversations with customers, we discovered that data replication as a technology has taken on a number of dimensions -- as part of enterprise-wide storage strategies that deliver on Recovery Time Objectives, as a tool for data retention, and as an aid to storage consolidation and migration. Storage vendors are also likely to position their data replication products differently in the coming years to support evolving market segments such as information lifecycle management (ILM), which appear to be much more holistic than previous approaches. That means data replication will be viewed more as part of a continuum of data protection technologies and less as a standalone tool.
By next year, customers making buying decisions should ask vendors what their integration plans will be with broader data protection strategies. In the meantime, there are a number of issues customers need to consider when evaluating these kinds of tools.
Delivery mechanism. Most customers already understand the various options they have in selecting data replication tools. Until now, storage system vendors have offered data replication as an option, but a growing number of vendors are beginning to offer network-based replication inside devices such as routers, switches or dedicated appliances. Determining which option is best depends on your existing environment, your budget and key features you are looking for in the replication platform. Subsystem-based data replication has had a reputation for being the most expensive approach, but this is gradually changing as more vendors offer midrange replication tools tied to their midrange platforms. Network-based replication platforms are very early in customer deployments, with only a few vendors offering products here.
Key considerations. Customers need to consider the following functionality as part of their evaluation criteria:
- Performance: How well does the replication product deal with performance issues? Does it employ compression or throttling of data?
- Availability: How much redundancy does the tool have if a failure occurs at the target? Are there retry mechanisms to ensure delivery?
- Application and file system awareness: Does the tool have awareness of, or special support for, files, databases and other applications?
- Ease of use: How easy is it to deploy and manage? Is there a centralized management console? Does it integrate with other management tools? Where are replication failures and problems reported? Does the tool support one-to-many and many-to-one configurations?
- Price: How do these investments meet ROI expectations for better business operations availability, improved network and storage system efficiency and reduced labor costs?
- Heterogeneity: Does the tool work across heterogeneous storage platforms and operating systems? There are trade-offs here depending on customer requirements for availability and performance.
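The performance and availability questions above can be made concrete with a small sketch. The Python below is illustrative only -- `replicate_block` and its `send` transport callable are hypothetical names, not any vendor's API -- but it shows the three behaviors worth probing for in an evaluation: compression to cut WAN bandwidth, throttling to pace transfers, and retries with backoff to survive transient target failures.

```python
import time
import zlib


def replicate_block(block: bytes, send, max_retries: int = 3,
                    max_bytes_per_sec: int = 1_000_000) -> int:
    """Compress a block, throttle its transfer, and retry on failure.

    `send` is a hypothetical transport callable; a real replication
    product implements this in its own network or block layer.
    Returns the number of compressed bytes handed to the transport.
    """
    payload = zlib.compress(block)             # compression: shrink WAN traffic
    delay = len(payload) / max_bytes_per_sec   # throttling: crude pacing delay
    for attempt in range(1, max_retries + 1):
        try:
            time.sleep(delay)                  # pace the transfer
            send(payload)                      # may raise on a target failure
            return len(payload)
        except OSError:
            if attempt == max_retries:
                raise                          # surface the replication failure
            time.sleep(0.1 * 2 ** attempt)     # back off before retrying
    return 0
```

A tool that lacks any one of these behaviors pushes the corresponding burden (bandwidth cost, link saturation, or manual re-sync after a blip) back onto the administrator.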
About the author: Jamie Gruener is a SearchStorage.com expert and the primary analyst focused on the server and storage markets for the Yankee Group, an industry analyst firm in Boston, Mass. Jamie's coverage area includes storage management, storage best practices, storage systems, storage networking and server technologies.