A data migration project can be complicated and time-consuming, but if you do the necessary preparation work and fully think through the process, you should be able to avoid some common data migration errors. Bob Laliberte, a senior analyst at Enterprise Strategy Group in Milford, Mass., offers up data migration tips, discusses the types of tools that enable data storage process automation and points out some of the challenging scenarios that users may confront. Read his answers to these frequently asked questions about data migration projects or listen to the MP3 below.
Download for later:
Data migration: Fix bad habits before moving data
Probably the first one I always start with is: Don't migrate your bad habits. Everyone creates their own environment, and on Day 1, it's perfect. But over time, as changes are made and things get added in the heat of battle, elements creep in that aren't as efficient or as optimized as they should be. So the first thing I always try to highlight is: Fix those when you get the opportunity. A data migration is your chance to put your storage back into an optimized state rather than carry an inefficient one forward. Make sure you remediate any orphaned storage and inefficient uses of storage and get back to a clean state.
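Remediating orphaned storage usually starts with a simple cross-reference: compare what the array says it is serving against what hosts say they are actually using. The Python sketch below illustrates the idea with made-up inventory data; the volume names, sizes and the reports they stand in for are all hypothetical, and in practice these lists would be exported from your array and host management tools.

```python
# Hypothetical inventories standing in for array and host reports.
array_volumes = {
    "vol-001": 500,   # volume name -> allocated size in GB
    "vol-002": 250,
    "vol-003": 1000,
    "vol-004": 750,
}

# Volumes that hosts report as mapped and in use.
host_mapped_volumes = {"vol-001", "vol-003"}

def find_orphans(array_vols, mapped):
    """Return volumes present on the array that no host has mapped."""
    return {name: size for name, size in array_vols.items() if name not in mapped}

orphans = find_orphans(array_volumes, host_mapped_volumes)
reclaimable_gb = sum(orphans.values())
print(f"Orphaned volumes: {sorted(orphans)}; reclaimable: {reclaimable_gb} GB")
```

Flagging these candidates before the migration, rather than after, is what keeps the inefficiency from being copied into the new environment.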
The second is probably being able to leverage automation. Instead of relying on a manual process, look for data migration tools that accelerate your ability to move that data without manual intervention.
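As a toy illustration of the copy-then-verify step that such tools automate, here is a minimal Python sketch. The file paths are assumptions, and real migration tools operate at the block or array level rather than on individual files, but the principle of verifying data before cutover is the same.

```python
# Minimal sketch of an automated copy-and-verify step (an assumption, not
# a real migration tool): copy the source, then confirm checksums match
# before the move is declared successful.
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def migrate(src: Path, dst: Path) -> bool:
    """Copy src to dst and report whether the copy verified cleanly."""
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)  # preserves timestamps along with contents
    return sha256_of(src) == sha256_of(dst)
```

A scripted run would iterate this over a manifest of sources and log any mismatch for remediation, which is exactly the human-error-prone bookkeeping a manual migration gets wrong.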
And lastly, make sure that you cover all your bases. A lot of times, people are focused on just one particular piece of the data migration, when what you really need is awareness of everything from end to end: the infrastructure, the software and the network traffic between them.
I think, certainly, there have been a lot of advances in data migration, and you really need to look for tools that help automate the discovery process as well as the migration process itself. Tools that map dependencies from the server level down to the storage can really help to increase your flexibility.
From a storage perspective, [tools that enable moving] data from one array to another, from thick to thin provisioning, and across heterogeneous environments are all very important today if you're going to do a data migration, especially if you're moving from one vendor to another. Those tools typically tend to be network- or host-based.
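One concrete payoff of a thick-to-thin move can be estimated before you migrate by comparing allocated capacity against what is actually written. A minimal sketch, with hypothetical report figures standing in for what a capacity report would show:

```python
# Hypothetical capacity report: thick volumes with their allocated size
# versus the data actually written to them.
thick_volumes = [
    {"name": "db01",   "allocated_gb": 1000, "written_gb": 320},
    {"name": "app01",  "allocated_gb": 500,  "written_gb": 90},
    {"name": "file01", "allocated_gb": 2000, "written_gb": 1400},
]

def thin_savings_gb(volumes):
    """Capacity freed if each volume consumes only what is actually written."""
    return sum(v["allocated_gb"] - v["written_gb"] for v in volumes)

print(f"Estimated reclaim from thin provisioning: {thin_savings_gb(thick_volumes)} GB")
```

Running the numbers like this up front is what turns a migration into a technology refresh with a measurable return, rather than a like-for-like copy.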
Sticking with "the same way we always did it" and that manual effort really leads to a lot of human error and a lot of wasted time. So one of the biggest things you always hear is "My project went over budget, over time." And it's a lot of those manual efforts that have led to that.
The other common mistake is copying the environment exactly as it is. If you're not reclaiming that orphaned data or moving to a more efficient platform, then you're going to get a lesser return on that migration and technology refresh.
And lastly, it's probably taking on too much risk. For those organizations that are just flying by the seat of their pants, and moving the data and spinning it up and hoping it all works, that can lead to a lot of outages and a lot of uncomfortable meetings at an executive level. Ensuring that you've remediated effectively is going to be an important step that you need to take in performing a data migration.
There are a couple of environments that obviously create additional challenges, and the first is probably switching to a different vendor. You may not have the tools that enable you to migrate from one vendor to another, so heterogeneous data migration tools are going to play an important role there. Without them, you're in for some headaches.
And then, the larger the environment, the more complicated it's going to be. There's a lot of media attention on data center transformation, and when you're going from a physical environment with monolithic arrays to a brand-new, fully virtualized, highly dynamic, thin-provisioned storage environment, the end result is going to be great, but getting there is going to be pretty painful. The process and end result will be worth the effort, but it's a much greater effort than simply swapping out an array from the same vendor.