The problem with this method, according to the Storage Network Industry Association (SNIA), is that it can significantly understate the true losses. First, the "cost of downtime" doesn't fully capture the cost of losing use of the data, including productivity and business losses; second, it doesn't include the risks inherent in the loss of the data itself.
Instead, SNIA suggests that you measure the cost of protection against the total value of the data (downtime, business and productivity losses, plus the cost to rebuild the data) multiplied by a risk factor of 10%.
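The arithmetic behind this approach can be sketched briefly. This is an illustration only: the function names and dollar figures below are assumptions for the example, not values from SNIA.

```python
# Sketch of the value-of-data costing approach described above.
# All names and dollar amounts are hypothetical, for illustration.

RISK_FACTOR = 0.10  # SNIA's suggested 10% risk factor


def value_of_data(downtime_cost, business_loss, productivity_loss, rebuild_cost):
    """Total value of the data: losses from its loss of use plus the cost to rebuild it."""
    return downtime_cost + business_loss + productivity_loss + rebuild_cost


def justified_protection_cost(total_value, risk_factor=RISK_FACTOR):
    """Benchmark for protection spending: total value of the data times the risk factor."""
    return total_value * risk_factor


# Example: $200,000 downtime, $500,000 business loss,
# $100,000 productivity loss, $200,000 to rebuild the data.
total = value_of_data(200_000, 500_000, 100_000, 200_000)  # $1,000,000
print(justified_protection_cost(total))  # 100000.0
```

On these assumed figures, a data protection system costing up to about $100,000 would be justified by the expected loss.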
The long-term goal, according to SNIA, is to divide data into classes, put a dollar value on each class, and classify risk factors by actuarial parameters; establishing standards for these parameters is one aim of SNIA's Data Protection Initiative. Meanwhile, the value-of-data approach provides a more realistic way to cost data protection systems.
Rick Cook has been writing about mass storage since the days when the term meant an 80 K floppy disk. The computers he learned on used ferrite cores and magnetic drums. For the last twenty years he has been a freelance writer specializing in storage and other computer issues.
This was first published in June 2004