Tiered storage data migration tool purchase considerations
Data migration tools move data between storage tiers as its value to the business changes. For example, unneeded data located on high-performance Tier 1 Fibre Channel drives can be migrated to a nearline SATA disk array. Later on, that data can be moved to a fixed-content archive system, VTL or tape library. In other cases, the data migration process is used to move information from an old storage system to a new platform.
Choosing the proper data migration tool isn't easy. There are many tools out there, and to choose the right one you'll have to understand the issues involved in any tiered storage acquisition, then evaluate factors such as transparency, interoperability with storage systems and data, complexity, policy enforcement and regulations related to retention and compliance. This article lists the criteria for purchasing data migration tools, and also offers specifications to help you compare products from data migration tool vendors such as Brocade Communications Systems Inc., CA, EMC Corp., Hewlett-Packard, Incipient Inc., Network Appliance Inc. and Symantec Corp.
Criteria for purchasing data migration tools
How does the data migration tool behave? Some data migration tools copy data from the source to the destination, then mirror writes to both locations until a final "cutover" occurs. You may prefer this if the migration will take place in phases or is intended to support a new or upgraded storage system. Other tools simply move the data to the target, then delete that data on the source. This process migrates the data without using additional storage, but makes it more difficult to "undo" the migration if problems arise. Some tools can even throttle their migration to avoid excess network traffic. Lab testing can help to identify any idiosyncrasies up-front.
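The copy-then-cutover behavior and the throttling described above can be illustrated with a minimal sketch. This is a hypothetical example, not any vendor's implementation: the function names and the 50 MB/s throughput cap are assumptions chosen for illustration.

```python
import os
import time

def throttled_copy(src, dst, chunk_size=1 << 20, max_mb_per_sec=50):
    """Copy src to dst in chunks, sleeping as needed to cap throughput
    and limit the migration's impact on production I/O."""
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            start = time.monotonic()
            chunk = fin.read(chunk_size)
            if not chunk:
                break
            fout.write(chunk)
            # Each chunk should take at least len/rate seconds at the cap.
            min_interval = (len(chunk) / (1 << 20)) / max_mb_per_sec
            elapsed = time.monotonic() - start
            if elapsed < min_interval:
                time.sleep(min_interval - elapsed)

def migrate_with_cutover(src, dst):
    """Copy-style migration: the source stays intact until cutover,
    which makes the migration easy to undo if problems arise."""
    throttled_copy(src, dst)
    # A real tool would mirror writes to both copies here until cutover.
    # Cutover: retire the source only after verifying the destination.
    if os.path.getsize(src) == os.path.getsize(dst):
        os.remove(src)  # a move-style tool deletes the source at this point
```

A move-style tool would skip the mirroring phase entirely and delete each source extent as soon as it is copied, trading recoverability for lower capacity overhead.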
How does migration impact storage performance? Data migration will impact the performance of your storage subsystems or storage fabric. This in turn may affect storage service levels or application availability. When evaluating a migration tool, measure storage performance with and without the migration process running. Performance will suffer most when moving a substantial amount of data (e.g., bringing a new or upgraded storage system online). An application that accesses data on one disk array may be unavailable until its data is migrated and the application is configured to access the new destination. For example, a database may need to be quiesced until a migration is complete. Data that is migrated on an ongoing basis, such as daily or weekly moves, typically has a smaller overall impact.
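One simple way to quantify that impact is a sequential-read benchmark run twice: once on a quiet system and once while the migration tool is active. The sketch below is an illustrative baseline measurement, not a substitute for a proper storage benchmark; the function name and 1 MB chunk size are assumptions.

```python
import time

def read_throughput_mb_s(path, chunk_size=1 << 20):
    """Time a full sequential read of path and return MB/s.
    Compare the result with and without the migration running;
    the difference approximates the migration's overhead."""
    total = 0
    start = time.monotonic()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            total += len(chunk)
    elapsed = time.monotonic() - start
    return (total / (1 << 20)) / elapsed if elapsed > 0 else 0.0
```

Note that filesystem caching can inflate the second run; on a real test, read a file larger than memory or drop caches between runs.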
Where will data migration be handled? Data migration tools can be host-, network- or storage-based. Most host-based tools run on a dedicated server, are storage-agnostic and can migrate data in the background, which allows for a greater range of storage system options. Network-based data migration is generally a feature of intelligent switches, making migration a function of the storage fabric itself. This approach is more robust, but potentially limits interoperability between switches and storage systems. Storage-based data migration occurs within the storage subsystem itself, such as Hitachi Data Systems' TagmaStore. Storage-based migration is the most transparent to hosts, but it is also the least interoperable approach.
Evaluate file and block-level tools. Data migration tools can operate at the file and block levels. Experts recommend the use of block-level data migration tools, but file-level tools are good in situations where storage is over-allocated to servers. For example, a Windows server may use 300 GB of a 500 GB volume, but a block-level migration tool sees the 500 GB volume and can only move files to another volume of equal or larger size. This can perpetuate inefficient storage provisioning. By comparison, a file-level data migration tool can move the 300 GB of files to a volume of that size or larger.
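The file-level advantage in the example above comes down to moving used bytes rather than the whole volume. Here is a minimal sketch of that idea, assuming a simple directory-tree copy; the function names are hypothetical and a real file-level tool would also preserve permissions, ownership and timestamps.

```python
import os
import shutil

def used_bytes(volume_root):
    """Sum the sizes of the files actually present -- what a file-level
    tool must move, regardless of the volume's provisioned size."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(volume_root):
        for name in filenames:
            total += os.path.getsize(os.path.join(dirpath, name))
    return total

def migrate_files(src_root, dst_root):
    """File-level migration: the destination only needs capacity for
    used_bytes(src_root), not the full source volume size."""
    shutil.copytree(src_root, dst_root)
```

A block-level tool, by contrast, sees only the volume's raw extents, so the 200 GB of unused space in the example travels along with the data.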
Weigh the tool's policy and automation features. Although data migration is often performed manually, it shouldn't have to be a manual process -- especially if data must be moved on an ongoing basis. Automation features can keep labor/administration costs to a minimum, so examine them closely. Some data migration tools employ policies to move certain data types to preferred storage. For example, a migration tool may move .MP3 or .JPG files directly to nearline storage, or offload any file that's been idle for more than 60 days. Brocade's Data Migration Manager (DMM) can automatically create migration pairs, implement automatic zoning and use a scheduler feature to perform tasks at pre-set times. A policy simulation feature in CA's File System Manager helps administrators understand the effectiveness of policies before actually moving data. Incipient's Network Storage Platform (NSP) provides automated provisioning.
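The policy examples above (media file types, or files idle for more than 60 days) can be sketched as a simple rules engine. This is an illustration of the concept only -- the names, the use of modification time as an idleness proxy, and the policy constants are all assumptions, not any vendor's behavior.

```python
import os
import shutil
import time

NEARLINE_TYPES = {".mp3", ".jpg"}   # hypothetical policy: media goes to nearline
IDLE_DAYS = 60                      # offload anything untouched this long

def apply_policy(tier1_root, nearline_root, now=None):
    """Walk Tier 1 storage and move files matching the policy
    (nearline file types, or idle longer than IDLE_DAYS) to nearline.
    Uses mtime as the idleness proxy, since atime is often disabled."""
    now = now if now is not None else time.time()
    cutoff = now - IDLE_DAYS * 86400
    moved = []
    for dirpath, _dirnames, filenames in os.walk(tier1_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            ext = os.path.splitext(name)[1].lower()
            if ext in NEARLINE_TYPES or os.path.getmtime(path) < cutoff:
                dst = os.path.join(nearline_root,
                                   os.path.relpath(path, tier1_root))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(path, dst)
                moved.append(dst)
    return moved
```

A simulation mode like the one in CA's File System Manager would run the same rule evaluation but report the matches instead of moving anything.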
What role does storage virtualization play? Finding and allocating physical storage is an administrative challenge. Virtualization adds a layer of abstraction between applications and storage, allowing administrators to pool and allocate storage without regard for physical locations or hardware platforms. This can simplify data migration because applications are no longer coupled to specific arrays or other storage devices. However, virtualization adds its own complexity to the storage environment; consider this before deploying a data migration tool.
29 Jan 2008