There are a number of reasons for mirroring data, including protection against disk failure, handling planned outages, disaster recovery from a remote location and improving local access to data. There are also a number of mirroring alternatives, such as remote mirroring and synchronous and asynchronous mirroring. Which strategy is best for a particular job depends critically on the reasons for mirroring and the economics of the installation.
The characteristics of the application can have an impact on mirroring strategies as well. Synchronous mirroring, in which both copies of the data are updated before the write is confirmed to the application, keeps the data sets in sync, but it is usually more expensive and can degrade performance. Asynchronous mirroring, in which the second data set may be cached, can save on network costs, but it doesn't guarantee data set coherency in the event of a problem. Costs aside, performance is one of the key factors in choosing between synchronous and asynchronous mirroring. The question here is "how fast is fast enough?" For a batch-oriented application, a delay of several seconds may be acceptable or even unnoticeable; for a transaction-based system or a highly interactive application such as CAD, even a second may be too long.
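The difference between the two approaches can be sketched in a few lines of Python. This is a toy in-memory illustration, not a real storage product's API; the class and method names are invented for the example.

```python
import queue
import threading

class MirroredStore:
    """Toy in-memory store illustrating sync vs. async mirroring.
    All names here are illustrative, not a real product API."""

    def __init__(self, synchronous=True):
        self.synchronous = synchronous
        self.primary = {}
        self.mirror = {}
        self._pending = queue.Queue()
        if not synchronous:
            # Background thread applies queued writes to the mirror later.
            threading.Thread(target=self._drain, daemon=True).start()

    def write(self, key, value):
        self.primary[key] = value
        if self.synchronous:
            # Synchronous: the mirror is updated before the write is
            # confirmed, so both copies stay coherent -- at a latency cost.
            self.mirror[key] = value
        else:
            # Asynchronous: confirm immediately; the mirror catches up
            # later, so a failure here could lose writes still queued.
            self._pending.put((key, value))
        return "ack"  # write confirmed to the application

    def _drain(self):
        while True:
            key, value = self._pending.get()
            self.mirror[key] = value
            self._pending.task_done()
```

In the synchronous case the application's write stalls until both copies are updated; in the asynchronous case the acknowledgment returns at once and the queue absorbs the network delay, which is exactly where the coherency risk lies.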
Economics becomes especially important when considering mirroring for non-disaster-related reasons. For example, if you are considering mirroring a data set to improve performance, you have to balance the cost of mirroring against the cost of other solutions, such as more powerful servers and bigger data pipes.
With a remotely mirrored site, especially an asynchronous one, it may be worthwhile to distribute the intelligence of the mirroring system by putting a server or similar device at the remote site. This makes it easier to guarantee the integrity of the data and to fail over if necessary, at the price of additional cost and possibly more latency in the system.
Distance between mirrored data sets has a critical influence on the design (and hence the costs) of mirroring. In its white paper on mirroring, DataCore Software suggests that if the distance is less than 500 meters, native Fibre Channel links will suffice. Up to 10 kilometers, transmission can be handled by using long-wave fiber between Fibre Channel switches. Up to 100 kilometers can be covered by using optical extenders, including Dense Wave Division Multiplexers. Beyond 100 kilometers, DataCore says, synchronous mirroring becomes difficult because of latencies introduced by the transmission system. Distances beyond the campus can also introduce right-of-way issues. There is an economic consideration as well: generally speaking, each of these solutions is more expensive than the last.
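The distance thresholds above amount to a simple decision rule. A minimal sketch, using the figures DataCore cites (the function name and return strings are illustrative, not from any product):

```python
def suggest_link(distance_km):
    """Map a mirroring distance to the link options described above.
    Thresholds follow the DataCore white paper figures cited in the text."""
    if distance_km < 0.5:  # under 500 meters
        return "native Fibre Channel links"
    if distance_km <= 10:
        return "long-wave fiber between Fibre Channel switches"
    if distance_km <= 100:
        return "optical extenders (e.g., Dense Wave Division Multiplexers)"
    # Past 100 km, transmission latency makes synchronous mirroring hard.
    return "consider asynchronous mirroring"
```

As the thresholds climb, so does the price: each branch of the rule generally represents a more expensive solution than the one before it.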
Rick Cook has been writing about mass storage since the days when the term meant an 80K floppy disk. The computers he learned on used ferrite cores and magnetic drums. For the last twenty years he has been a freelance writer specializing in storage and other computer issues.