Here is another prime example of technology evolving to meet business requirements. I have always been a firm believer...
that, except for small IT environments, operating system files should not be part of regular backups. This was especially true when using a full and incremental backup product. There are too many identical static files (DLLs and patches) across systems, and they end up consuming far too much backup storage capacity.
Once again, this is where technology changes come to the rescue. Data deduplication has dramatically reduced the impact of backing up OS files on the backup storage infrastructure. Since identical data segments are not stored (only pointers to them are kept) when using deduplication, capacity becomes much less of an issue at the storage level. However, if OS files are backed up, the backup software still has to inspect every one of them for changes and send each changed file across the network, even if only a single byte has changed since the last backup. There is also still the issue of the registry, which makes a traditional backup of the OS files questionable.
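To illustrate the principle, here is a minimal Python sketch of fixed-size-segment deduplication. All names (`dedup_store`, `restore`, the tiny 4-byte chunk size) are hypothetical and chosen for illustration only; commercial products use far larger, often variable-size segments and more sophisticated indexing.

```python
import hashlib

CHUNK_SIZE = 4  # tiny segment size for illustration; real products use KB-scale segments


def dedup_store(data: bytes, store: dict) -> list:
    """Split data into fixed-size segments; store each unique segment once,
    keyed by its SHA-256 hash, and return the list of hashes (the pointers)."""
    pointers = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # identical segments are stored only once
        pointers.append(digest)
    return pointers


def restore(pointers: list, store: dict) -> bytes:
    """Rebuild the original data by following the pointers back to the segments."""
    return b"".join(store[p] for p in pointers)


store = {}
# Two "systems" whose OS files are largely identical:
ptrs_a = dedup_store(b"ABCDABCDXXXX", store)
ptrs_b = dedup_store(b"ABCDABCDYYYY", store)
# Six segments were referenced in total, but only three unique ones are stored.
```

The point of the sketch: both systems reference the shared `ABCD` segment through pointers, so the duplicated OS content costs almost nothing in backup storage, exactly the effect described above.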
Using the replication component of server virtualization technology or ghosting, OS and registry replication, or clustering and system recovery products is still a better alternative than restoring OS files or reinstalling the OS. Furthermore, these system images can now occupy far less storage space when combined with deduplication technology.
Check out Pierre's answer to this question from last year.