My company has 20-plus remote sites that currently maintain their own server backups. Each site has the software (Backup Exec 9) and hardware on site.
We are looking at centralizing all of our backups. I have done a couple of trial runs backing up across the wire, and it is dog slow! I did this by using the Backup Exec software and backing up to a tape unit we have here in our IT department. As I said, this was dog slow and would never work for all 20 sites.
I am just looking for any ideas or suggestions on how I could make this work. Thanks for any help.
Thank you for the question. Your situation is not unusual, as many enterprises are trying to consolidate and get control of data at remote locations. I believe the goals that you have are as follows:
1. Eliminate the need for local tape devices at the remote sites.
2. Provide for centralized management of the backup process and associated resources.
3. Hopefully, deliver a higher level of backup functionality: for example, a higher backup success rate, the ability to restore on demand, and elimination or reduction of the ever-present backup window.
Centralizing has some great advantages for traditional backups, but the main issue is that most backup applications do not work at the block level. Rather, they work at the file level, backing up the full set of files regardless of how little of the data may have changed at the block level.
To understand this, imagine a database in which a single record in one table is updated, not the whole database. In this case, the only change to the data occurs in that table entry; the rest of the database remains unchanged. The backup application doesn't know that only the table entry changed; it sees that the database file's date has changed and marks the whole file for backup, not understanding that only a small part of the file actually changed. Because only a small percentage of the underlying blocks have changed, you could back up only those changed blocks and, in effect, have an up-to-date full copy of the file.
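The block-level idea above can be sketched in a few lines. This is a hypothetical illustration of the general technique, not how Backup Exec or any particular product implements it; the block size and function names are assumptions for the example. Split the file into fixed-size blocks, hash each block, and compare the hashes against those recorded at the last backup; only the blocks whose hashes differ need to cross the wire.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size for illustration


def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block of a file's contents."""
    return [
        hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
        for i in range(0, len(data), BLOCK_SIZE)
    ]


def changed_blocks(old: list[str], new: list[str]) -> list[int]:
    """Return the indices of blocks whose hash differs from the
    previous backup (or that did not exist before)."""
    changed = []
    for i, digest in enumerate(new):
        if i >= len(old) or old[i] != digest:
            changed.append(i)
    return changed
```

On a 1 GB database file where one record changed, this approach would send one 4 KB block instead of the entire file, which is exactly why block-level backups are so much friendlier to a slow WAN link.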
There are a variety of solutions on the market that address this problem. One of the best that I am aware of comes in the form of an open systems snap vault solution. This solution offers the ability to back up data based on the changes in the data itself, not the complete data set. It works over your current networking infrastructure and can be used in conjunction with a disk subsystem for staging backups prior to archiving them to a tape device for long-term storage. Like a traditional backup application, it requires agents on each host at the remote sites, but each time it runs it produces a full backup by transferring only a diff of what has changed since the last run. Unfortunately, this will mean purchasing a new software solution for your environment, but in the end the purchase price will be easily offset by the savings in management and the reduction in overhead compared to the current backup architecture.
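The way a diff-based transfer still yields a full backup is worth spelling out: the central site keeps the previous full copy and applies only the changed blocks to it, producing a new, fully restorable copy (sometimes called a synthetic full). The sketch below is a hypothetical illustration of that reconstruction step, assuming a fixed block size; it is not any vendor's actual API.

```python
BLOCK_SIZE = 4096  # assumed block size for illustration


def apply_changed_blocks(previous_full: bytes, changes: dict[int, bytes]) -> bytes:
    """Rebuild an up-to-date full copy from the prior full copy plus
    the blocks that changed since the last run."""
    blocks = [previous_full[i:i + BLOCK_SIZE]
              for i in range(0, len(previous_full), BLOCK_SIZE)]
    for index, data in changes.items():
        # Extend with empty blocks if the file grew past its old end.
        while index >= len(blocks):
            blocks.append(b"")
        blocks[index] = data
    return b"".join(blocks)
```

Because the reconstruction happens centrally, the remote site only ever ships the small diff, yet the restore point at the central site is always a complete copy, so there is no incremental chain to replay at restore time.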
There are several other solutions on the market in the form of WAN optimization appliances for caching and compressing the data on the remote side of the environment before sending it to the central facility, but to date these solutions have proven expensive and disruptive to implement.
Should you require further assistance or have additional questions or concerns please feel free to let me know.