The events of Sept. 11 have caused many IT decision makers to rethink their organization's disaster recovery plans.
According to Network World's annual IT spending survey, IT executives' spending priorities have shifted toward disaster recovery, security and videoconferencing. When asked which specific areas would receive more money in 2002, 45 percent of the survey's 598 IT respondents answered "disaster recovery."
Now that the importance of a complete disaster recovery solution has been established, let's move on to what IT executives need to consider when deciding how to protect their organization's most critical asset: data.
A crucial but often overlooked component of a backup and disaster recovery plan is backing up files that are open and in use. One of the biggest obstacles to getting complete and accurate copies of a server's data is the ineffective backup of open files.
These days, 24/7 access to mission-critical applications such as Web sites, databases and messaging systems is essential. As organizations and the amount of data they store continue to grow, scheduling backups during periods of inactivity has become nearly impossible for IT administrators.
Because many of these mission-critical applications are in use 24/7, they often don't get properly backed up. Losing such essential data can be disastrous for any organization.
Many backup software packages will skip open files initially, and then go back and try to access them again at the end of the backup. In many cases, however, the files are still open and therefore not backed up. Even if the backup software is able to access a file in a subsequent attempt, that file will not be synchronized with other related files, which can ultimately lead to corrupt data on tape.
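The skip-and-retry behavior described above can be sketched in a few lines. This is a minimal illustration, not how any particular backup product is implemented; the function name and the use of an `OSError` to signal a locked file are assumptions for the demo (on Windows, an exclusively locked file typically raises an error on open; on Unix, file locking is advisory):

```python
import shutil
from pathlib import Path

def backup_with_retry(files, dest_dir):
    """Copy each file to dest_dir. Files that cannot be opened
    (e.g. locked by another process) are skipped and retried once
    at the end of the backup, mirroring typical backup-software
    behavior. Returns the files that were never backed up."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    skipped = []
    for f in files:
        try:
            shutil.copy2(f, dest / Path(f).name)
        except OSError:            # file open/locked: defer it
            skipped.append(f)
    still_open = []
    for f in skipped:              # second pass at end of backup
        try:
            shutil.copy2(f, dest / Path(f).name)
        except OSError:
            still_open.append(f)   # still open: not backed up at all
    return still_open
```

Note that even when the second pass succeeds, the late copy is taken at a different moment than the rest of the backup, which is exactly the synchronization problem described above.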
In another scenario, the backup package could force the backup of open files. If any changes occur to a file during the backup, parts of the changes might be captured by the backup software, while other parts may be in areas of the file that have already been read for backup. This will also lead to corrupt data on tape. Some backup packages offer an open file solution through the use of application-specific or generic agents.
Dedicated application agents are available for a handful of database and email applications. They are designed to work with specific versions of the applications and only with a particular backup software program. Using application-specific agents can be quite costly, as agent upgrades may be necessary when deploying updated versions of applications and/or backup software.
Generic agents are intended to work with all applications, but are generally compatible with only a single backup package. Changing backup software versions often requires customers to obtain updated generic agents from the vendor in order to continue using them in the new environment. Additionally, generic agents are often limited in their synchronization capabilities, providing only file-by-file or volume-by-volume synchronization. This can compromise the relational integrity of databases that span several volumes, again leading to corrupt data on tape. In some cases, administrators must manually identify groups of related files that need to be handled in a synchronized manner.
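The group-synchronization requirement above can be illustrated with a short sketch: all related files must be copied under a single quiesce point, not one at a time with writes allowed in between. The lock-based quiescing and function name below are assumptions for the demo (a real agent would coordinate with the application or filesystem rather than use an in-process lock):

```python
import shutil
import threading
from pathlib import Path

# Stands in for quiescing the application; writers would hold this
# lock while updating any file in the related group.
write_lock = threading.Lock()

def copy_group_synchronized(files, dest_dir):
    """Copy a group of related files (e.g. a database spanning several
    volumes) under one quiesce point, so all copies reflect a single
    moment in time. Copying them independently could capture each file
    at a different moment and break the group's relational integrity."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    with write_lock:                 # block writers for the whole group
        for f in files:
            shutil.copy2(f, dest / Path(f).name)
```

This is the manual work the article describes: someone must know which files belong together before a synchronized copy of the group is possible.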
The solution to protecting open and in-use files during the backup process is to find a complementary utility that helps the backup software successfully capture open files without locking users out of applications.
About the author: John E. Jones is CEO and president of St. Bernard Software. For more information or a free trial of St. Bernard's Open File Manager, visit www.stbernard.com or call 1-800-782-3762.