I have two questions I'd like to ask.
1. How can I determine that a properly designed backup system would fail far less than 1 in 10 times?
2. What would you recommend for reconfiguring or redesigning the system? How can this be accomplished without affecting critical systems and applications?
1. That's an accepted fact: a properly designed system should indeed fail FAR less than 1 in 10 times, which is simply an unacceptable failure rate. The backup systems that I implement, for example, experience some level of failure in fewer than 1 in 100 runs, and usually far fewer than that. In addition, the failures they do encounter are usually worked around automatically by the system. For example, if a tape write fails, the backup is automatically retried.
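The automatic work-around described above is essentially retry logic. As a rough illustration only, here is a minimal sketch of that idea in Python; the function names, the flaky writer, and the retry limits are all hypothetical, not part of any real backup product:

```python
import time

def backup_with_retry(write_fn, data, max_attempts=3, delay_s=1.0):
    """Attempt a backup write, retrying automatically on failure.

    write_fn, max_attempts, and delay_s are hypothetical parameters
    for illustration; real backup software handles this internally.
    Returns the number of attempts it took to succeed.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            write_fn(data)       # e.g. a tape write
            return attempt
        except IOError:
            if attempt == max_attempts:
                raise            # give up and surface the failure
            time.sleep(delay_s)  # brief pause before retrying

# Example: a simulated tape drive that fails once, then succeeds.
calls = {"n": 0}
def flaky_write(data):
    calls["n"] += 1
    if calls["n"] == 1:
        raise IOError("tape write failed")

attempts = backup_with_retry(flaky_write, b"payload", delay_s=0.0)
print(attempts)  # → 2 (first write failed, retry succeeded)
```

The point is that a single transient media failure never surfaces to the operator as a failed backup; only a repeated, persistent failure does.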
2. I always recommend that this be done by a professional consultant specializing in backup and recovery. This industry is now much more complicated than it used to be, and it is rare that a fully-functional, error-free backup system is implemented without professional help. There are simply too many variables and too much technology to keep up with.
Editor's note: Do you agree with this expert's response? If you have more to share, post it in one of our discussion forums.