Bridging the gap: Choosing storage-over-distance network technology

Disaster recovery and remote backup programs rely on an efficient, cost-effective WAN. Learn which transport is best and the related implementation issues.

What you will learn from this tip: Fiber-optic network technology is often required for long-distance data transmission, but you need to know which transport is best and what the related implementation issues are.

High-speed optical WANs are becoming indispensable for critical storage applications. Business continuance, remote mirroring and replication, and connecting regional data centers are all tasks that require optical WANs. With so many optical options and price points, picking the best technologies and techniques to transport data is challenging.

Most storage applications are time-sensitive and require high throughput (bandwidth) and low latency with zero data loss. Effective bandwidth measures how much of the available bandwidth can actually be used, accounting for dropped packets and retransmissions due to congestion and protocol inefficiency.
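
As a back-of-the-envelope illustration (not a vendor formula), effective bandwidth can be sketched in a few lines of Python; the overhead and loss figures below are hypothetical placeholders you would replace with measured values:

```python
def effective_bandwidth(link_gbps, protocol_overhead=0.05, loss_rate=0.001,
                        retransmit_penalty=2.0):
    """Rough effective throughput after protocol overhead and retransmits.

    Illustrative model only: assumes each lost packet costs roughly
    retransmit_penalty times its share of the usable bandwidth.
    """
    usable = link_gbps * (1 - protocol_overhead)
    lost = usable * loss_rate * retransmit_penalty
    return usable - lost

# A 1 Gb/sec link with 5% protocol overhead and 0.1% packet loss
# delivers noticeably less than 1 Gb/sec of effective bandwidth.
print(round(effective_bandwidth(1.0), 4))
```

Even modest packet loss erodes throughput, which is why the evaluation criteria below treat loss and latency as first-class concerns alongside raw bandwidth.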


To pick a data transport to fit your needs, first consider your storage requirements:

  • Distance. How far away do you need to keep a copy of your data?

  • Bandwidth. How much data must be moved and in what timeframe?

  • Recovery point objective. How much data loss can you afford? To what point in time must you be able to recover data?

  • Recovery time objective. How quickly do you need to recover?

  • Latency. What are your applications' response time requirements?
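
To put rough numbers on the distance and bandwidth questions above, here is a small Python sketch; the 5 microseconds-per-kilometer figure is the commonly cited propagation delay of light in fiber, and the workload numbers are hypothetical:

```python
# Light in fiber travels at roughly two-thirds the speed of light in a
# vacuum, about 5 microseconds per kilometer one way.
FIBER_US_PER_KM = 5.0

def required_gbps(data_tb, window_hours):
    """Sustained bandwidth needed to move data_tb terabytes in window_hours."""
    bits = data_tb * 1e12 * 8
    return bits / (window_hours * 3600) / 1e9

def round_trip_ms(distance_km):
    """Round-trip propagation delay, ignoring equipment and protocol delays."""
    return 2 * distance_km * FIBER_US_PER_KM / 1000

print(round(required_gbps(10, 8), 2))   # 10 TB in an 8-hour window
print(round_trip_ms(100))               # 100 km link, in milliseconds
```

The round-trip figure matters most for synchronous replication, where every write waits on the remote acknowledgment; the sustained-bandwidth figure matters most for bulk backup windows.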

When evaluating storage-over-distance network technology, keep the following items in mind:

  • Throughput. How much bandwidth is available and how much is needed?

  • Latency. What's the time delay and how does it impact the storage application?

  • Packet loss. How much bandwidth is lost due to data retransmission?

  • Variable bandwidth. What's the granularity of available bandwidth?

  • Budget. What are the initial upfront and recurring costs?

Storage-over-distance options:

  • Dedicated and dark fiber optic cabling and wavelength services

  • Wave division multiplexing (WDM, CWDM, DWDM)

  • SONET/SDH optical carrier (OC) based networking and packet over SONET (POS)

  • Metropolitan and wide area Ethernet services

  • TCP/IP based services, networks, and protocols including FCIP, iFCP, iSCSI

The more layers and protocols used in the network, the greater the latency and the potential disruption to storage applications. Understanding which layers are involved in the network is important because each layer adds complexity, cost and latency.

Diverse network paths are critical for uninterrupted network service. Make sure your network provider can guarantee diverse network paths not only through its own core network, but through its partners' core networks as well. Also determine how the service provider will manage and guarantee network performance (low latency and effective bandwidth).

A common mistake is to evaluate bandwidth simply in terms of dollars per Gb/sec. What matters is the effective or actual usable amount: the level of utilization that can be maintained at a given response time (latency level) without congestion and packet delay or loss.
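
This point can be illustrated with a short Python sketch using hypothetical prices: the link that is cheaper per raw Gb/sec can be more expensive per Gb/sec you can actually sustain:

```python
def cost_per_effective_gbps(monthly_cost, raw_gbps, achievable_utilization):
    """Dollars per Gb/sec of bandwidth you can actually sustain.

    achievable_utilization: the fraction of the link usable before latency
    and packet loss become unacceptable (often well below 1.0).
    """
    return monthly_cost / (raw_gbps * achievable_utilization)

# Two hypothetical offers: a cheaper link usable only to 50% utilization
# versus a pricier, better-managed link usable to 90%.
print(round(cost_per_effective_gbps(3000, 1.0, 0.5), 2))   # 6000.0
print(round(cost_per_effective_gbps(4500, 1.0, 0.9), 2))   # 5000.0
```

On these made-up numbers, the nominally cheaper service costs more per effective Gb/sec, which is the figure that determines whether your replication actually keeps up.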

Avoid the mistake of prototyping a storage application at a reduced workload and then assuming that heavier workloads will scale linearly in bandwidth and latency. Instead of scaling linearly, effective bandwidth can drop off as workload is added while latency climbs, resulting in poor performance, particularly for synchronous storage applications. First and foremost, understand your needs and the capabilities of these different technologies.
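
One simple way to see why scaling is not linear is the classic M/M/1 queueing result, where delay grows as 1/(1 - utilization). This is an idealized model, not a prediction for any specific network, but it shows the shape of the problem:

```python
def queueing_delay_factor(utilization):
    """M/M/1-style multiplier on service time as a link fills up."""
    if not 0 <= utilization < 1:
        raise ValueError("utilization must be in [0, 1)")
    return 1 / (1 - utilization)

# Delay is modest at half load, then explodes as the link saturates.
for u in (0.5, 0.8, 0.9, 0.95):
    print(u, round(queueing_delay_factor(u), 1))
```

A prototype run at 50% load sees a 2x delay factor; the same link at 95% load sees 20x, which is exactly the kind of cliff that breaks synchronous replication in production.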

You can read and learn more in the Storage magazine article "Bridging the gap" by Greg Schulz.


About the author: Greg Schulz is a senior analyst with the independent storage analysis firm The Evaluator Group Inc.
