This article can also be found in the Premium Editorial Download "Storage magazine: What to do when storage capacity keeps growing."
DR plan is too expensive
Some deep-pocketed enterprises have taken an approach, at least for centralized application data, that is simple but very expensive: All data is placed on the highest-cost storage tier and replicated both locally and remotely for rapid recovery in almost any failure scenario. For organizations with well-funded storage departments in tightly regulated industries, the obvious next step is more of the same. These companies can continue to treat all data identically and spend more money to get more protection.
However, if capital resources are limited, there's an alternative: recognize that not all data is the same and manage different data classes according to their actual business requirements. In such an environment, 10% to 20% of data is typically underprotected, while 40% or more is overprotected.
A good data classification program can identify critical data that deserves a higher level of protection. It can also pinpoint data sets that are protected and replicated too much, enabling you to redeploy existing resources and meet data growth requirements with less-expensive infrastructure. If this is done well, most enterprises can improve protection, performance and recovery for the most critical data while reducing overall storage costs.
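To make the economics concrete, here is a minimal back-of-the-envelope sketch of the savings a classification program can surface. The 40% "overprotected" share comes from the article's typical figures; the total capacity and per-GB tier costs are purely hypothetical assumptions for illustration.

```python
# Hypothetical two-tier cost model; all dollar figures and the
# 100 TB capacity are invented for illustration only.
TOTAL_GB = 100_000

# Assumed fully loaded cost per GB, including replication/backup.
COST_PER_GB = {
    "tier1_replicated": 30.0,  # premium storage, local + remote replicas
    "tier2_backup": 12.0,      # midrange storage, periodic backup
}

def total_cost(allocation):
    """Total cost of a {tier: GB} allocation."""
    return sum(COST_PER_GB[tier] * gb for tier, gb in allocation.items())

# "Everything on the top tier" approach.
before = {"tier1_replicated": TOTAL_GB, "tier2_backup": 0}

# After classification: move the ~40% overprotected data down a tier,
# keeping critical data (including newly identified gaps) on tier 1.
after = {"tier1_replicated": 0.6 * TOTAL_GB, "tier2_backup": 0.4 * TOTAL_GB}

savings = total_cost(before) - total_cost(after)
print(f"Before:  ${total_cost(before):,.0f}")
print(f"After:   ${total_cost(after):,.0f}")
print(f"Savings: ${savings:,.0f} ({savings / total_cost(before):.0%})")
```

Under these assumed numbers, demoting the overprotected 40% cuts storage spend by roughly a quarter, freeing budget that can be redirected to the underprotected critical data.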
This was first published in June 2006