I read your explanation on H/W vs. S/W replication techniques. In replicating large amounts of data across databases, have you experienced any problems?
Sure... there can be many issues when doing remote copy for DR.
One of the basic problems customers face is bandwidth and link reliability. Over time you will add capacity at your production site, and if the link between the local and remote sites was not spec'ed with growth in mind, replication can hit a bottleneck. This dramatically affects synchronous copy to a DR site: your production apps will slow to a crawl, and the remote copy may even suspend itself due to timeout conditions on the link.
A good rule of thumb for defining your connection requirements is 10 Mbit of network bandwidth for every 1 MByte of write I/O per second. Determine your write I/O (WIO) during peak periods. This is important, as many folks set requirements based on the "average" load. During end-of-month, end-of-year or batch processing, your WIO may increase dramatically and flood the links. You can estimate average WIO per day by looking at your incremental backup logs and totaling the megabytes backed up, but the best approach is to use a tool like NT PERFMON. Set up a log of all disk I/O over a one-week period that includes month-end processing. (On NT, remember to run the diskperf -Y command at a command prompt and reboot to enable disk statistics.)
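The rule of thumb above (10 Mbit of bandwidth per 1 MByte/s of write I/O) can be turned into a quick sizing calculation. This is a minimal sketch; the function name, the optional growth headroom parameter, and the 40 MB/s peak figure are illustrative, not from the original answer:

```python
def required_link_mbit(peak_write_mb_per_s, headroom=1.0):
    """Estimate link bandwidth (Mbit/s) for synchronous remote copy.

    Rule of thumb: 10 Mbit of network bandwidth per 1 MByte/s of
    write I/O, measured at peak (month-end/batch) load, not average.
    'headroom' is an optional multiplier for future capacity growth.
    """
    return peak_write_mb_per_s * 10 * headroom

# Hypothetical example: PERFMON shows 40 MB/s of writes at month-end.
peak_writes = 40  # MByte/s
print(required_link_mbit(peak_writes))                 # 400.0 Mbit/s
print(required_link_mbit(peak_writes, headroom=1.5))   # 600.0 Mbit/s with 50% growth margin
```

Sizing against the peak figure rather than the daily average is the whole point of the exercise: a link that handles the average load comfortably can still be flooded during batch windows.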
Also, any time you run a DBCC (database consistency check) on your databases, it will touch every block in the database. The same is true of utilities such as the Exchange maintenance tools. Before running these utilities, it's best to suspend the remote copy process for the affected volumes. On some disk arrays this can be automated using vendor-supplied scripts for backup and databases.
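The suspend-run-resume pattern described above can be sketched as a small planning function. The command names `suspendcopy`, `run_dbcc` and `resumecopy` are placeholders, not a real vendor CLI; an actual automation script would invoke whatever tools your array and database vendors supply:

```python
def plan_dbcc_run(volumes):
    """Build the command sequence for a consistency check:
    suspend remote copy on each volume, run the DBCC, then resume.

    'suspendcopy'/'resumecopy' stand in for the array vendor's CLI;
    'run_dbcc' stands in for the database consistency-check utility.
    """
    steps = [("suspendcopy", vol) for vol in volumes]
    steps.append(("run_dbcc", "all"))
    steps.extend(("resumecopy", vol) for vol in volumes)
    return steps

# Show the ordered plan for two replicated volumes.
for cmd, target in plan_dbcc_run(["vol1", "vol2"]):
    print(cmd, target)
```

The ordering is what matters: every replicated volume is suspended before the check touches a block, and nothing resumes until the check is done, so the full-scan traffic never hits the replication link.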
Software remote copy has the same issues. Running a DBCC on Oracle while host-based replication is active will drive both CPU and network usage sharply upward. Keeping these things in mind, and having an expert help you with the nuances of the remote copy process, can eliminate most of these problems before they ever occur.
Editor's note: To view Chris' answers to the H/W-S/W replication techniques questions referred to above, go to:
H/W-S/W replication techniques, Part I http://www.searchStorage.com/ateQuestionNResponse/0,289625,sid5_cid414545_tax286192,00.html
H/W-S/W replication techniques, Part II http://www.searchStorage.com/ateQuestionNResponse/0,289625,sid5_cid414550_tax286192,00.html
H/W-S/W replication techniques, Part III http://www.searchStorage.com/ateQuestionNResponse/0,289625,sid5_cid414554_tax286192,00.html
Editor's note: Do you agree with this expert's response? If you have more to share, post it in our Storage Networking discussion forum at http://searchstorage.discussions.techtarget.com/WebX?50@@.ee83ce4.