
Replicating large amounts of data

I read your explanation on H/W vs. S/W replication techniques. In replicating large amounts of data over databases, have you experienced any problems?

Sure... there can be many issues when doing remote copy for DR.

One of the most basic problems customers face is bandwidth and link reliability. Over time you will be adding capacity to your production site, and if the link between the local and remote sites was not spec'd out with growth in mind, you can hit a bottleneck during replication. This dramatically affects synchronous copy to a DR site: your production apps will slow to a crawl, and the remote copy may suspend itself due to timeout conditions on the link.

A good rule of thumb for defining your connection requirements is 10 Mbit/s of network bandwidth for every 1 MB/s of write I/O. Determine your write I/O (WIO) during peak periods. This is important, as many folks size for the "average" load; during end-of-month, end-of-year, or batch processing, your WIO may increase dramatically and flood the links. You can estimate average WIO per day by looking at your incremental backup logs and totaling the megabytes backed up. A better way, though, is to use something like NT PERFMON logs: set up a log file for all disk I/O over a week that includes month-end processing. (On NT, remember to run the diskperf -Y command in a DOS prompt and reboot to enable disk statistics.)
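The rule of thumb above is easy to turn into a quick sizing calculation. A minimal sketch, assuming you have already measured your peak write rate from PERFMON; the headroom factor is an assumption added here to leave room for growth, not part of the original rule:

```python
# Size the replication link from peak write I/O, using the rule of thumb
# of 10 Mbit/s of network bandwidth per 1 MB/s of write I/O.

def required_link_mbit(peak_write_mb_per_sec: float, headroom: float = 1.0) -> float:
    """Bandwidth in Mbit/s needed to keep up with a given peak write rate."""
    return peak_write_mb_per_sec * 10 * headroom

# Example: 12 MB/s of writes measured during month-end batch processing
print(required_link_mbit(12))         # 120 Mbit/s nominal
print(required_link_mbit(12, 1.25))   # 150 Mbit/s with 25% growth headroom
```

The key point is to feed in the peak figure, not the daily average, or the link will be undersized exactly when the batch jobs run.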

Also, if at any time of day you run a DBCC (database consistency check) on your databases, it will touch every block in the database. The same is true of utilities like those for Exchange. Before running these utilities, it's best to suspend the remote copy process for the affected volumes. On some disk arrays this can be automated using vendor-supplied automation scripts for backup and databases.
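The suspend-then-check flow can be sketched as follows. The three helper functions are hypothetical stand-ins for whatever vendor CLI or automation script your array provides; the names and calls here are assumptions, not a real API:

```python
# Sketch of suspending remote copy around a consistency check.
# suspend_remote_copy / resume_remote_copy / run_consistency_check are
# hypothetical placeholders for vendor-specific commands -- substitute
# whatever your disk array or replication software actually provides.

actions = []  # records what ran, in order (stands in for real CLI calls)

def suspend_remote_copy(volume):
    actions.append(f"suspend {volume}")   # vendor CLI call goes here

def resume_remote_copy(volume):
    actions.append(f"resume {volume}")    # vendor CLI call goes here

def run_consistency_check(volume):
    actions.append(f"dbcc {volume}")      # e.g. DBCC, or an Exchange utility

def checked_dbcc(volume):
    """Run a consistency check with remote copy suspended for that volume."""
    suspend_remote_copy(volume)
    try:
        run_consistency_check(volume)
    finally:
        # Always resume, even if the check fails, so replication is
        # never left suspended by accident.
        resume_remote_copy(volume)

checked_dbcc("db_vol01")
```

The try/finally guard is the part worth copying: a check that aborts halfway through should not leave the remote copy suspended indefinitely.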

Software remote copy has the same issues. If you run a DBCC on Oracle while running host-based replication, CPU and network usage will shoot up. Keeping these things in mind, and having an expert help you with the nuances of the remote copy process, can prevent most of these problems from ever occurring.


Editor's note: To view Chris's answers on the H/W vs. S/W replication techniques referred to in the question above, go to:

H/W-S/W replication techniques, Part I https://searchdisasterrecovery.techtarget.com/answer/H-W-S-W-replication-techniques-Part-I

H/W-S/W replication techniques, Part II https://searchstorage.techtarget.com/answer/H-W-S-W-replication-techniques-Part-II

H/W-S/W replication techniques, Part III https://searchstorage.techtarget.com/answer/H-W-S-W-replication-techniques-Part-III

