Big files create big backup issues
This article appears in the Vol. 7 No. 3 May 2008 issue, "How to plan for a disaster before a software upgrade."
The following tips will help lessen your big file backup problems.

It's a long-standing problem: As data piles up on a server, completing a successful backup becomes harder. Backup apps become bogged down with millions of files to examine, and network and CPU limits can stall throughput when transferring a gigantic file. Even if a backup job is successful, the data in a large file may have changed in the hours it took to create the backup image. Vendors and users are now applying new ideas and technologies to ensure that no data set is too big to back up.

The rapid creation and accumulation of stored data has pushed traditional backup approaches to their breaking point. Large amounts of storage capacity and advances in processing power have led users to believe that a virtually unlimited amount of data can be stored and protected, but most backup managers will admit that it's just not so. While tape drives have become larger and faster, and new technologies like LAN-free and disk-based backup have reduced the load, the old approach of "scan everything every ...
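To see why millions of files bog a backup app down, consider the file-selection step itself. The sketch below is illustrative only, not any vendor's actual implementation: it picks out files modified since the last backup pass, yet still has to walk every directory entry, which is exactly the metadata bottleneck described above.

```python
import os


def incremental_scan(root, last_backup_time):
    """Return paths of files modified after last_backup_time.

    A hypothetical sketch of the incremental selection step a backup
    app performs. Note that even "incremental" selection must stat
    every file in the tree, so scan time grows with file count even
    when almost nothing has changed.
    """
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) > last_backup_time:
                    changed.append(path)
            except OSError:
                continue  # file vanished mid-scan; skip it
    return changed
```

Newer approaches sidestep this walk entirely, for example by tracking changed blocks at the file-system or volume layer instead of re-examining every file.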
- All thin provisioning is not created equal
Tools to test your DR plan
by Robert L. Scheier
Periodically testing a disaster recovery (DR) plan is essential, but it can be a time-consuming and expensive task. New tools that check DR configurations and constantly monitor your site's readiness to recover from a disaster can cut costs and testing time, and provide a level of confidence that your DR plan will actually work when it's needed.
Big files create big backup issues
Big files and millions of files clogging storage systems can create big backup headaches. While there's no quick fix to the problem of big backups, there are many effective approaches, including adjustments to your backup process and newer technologies from backup vendors.
- Hard disk drives become more affordable
Ask the Expert: NFS vs. CIFS
What should you consider when choosing between the Network File System (NFS) and the Common Internet File System (CIFS)?
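Client operating system is usually the first consideration in that choice. The helper below is purely illustrative (the function name and the "either" fallback are assumptions, not part of any product) and encodes the common rule of thumb: NFS for Unix-like clients, CIFS for Windows clients, and potentially both in mixed shops.

```python
def suggest_protocol(client_os):
    """Rule-of-thumb starting point for NFS vs. CIFS.

    Hypothetical helper: NFS suits Unix-like clients, CIFS suits
    Windows clients; mixed environments often end up serving both.
    """
    unix_like = {"linux", "unix", "solaris", "aix", "hp-ux"}
    os_name = client_os.lower()
    if os_name in unix_like:
        return "NFS"
    if os_name == "windows":
        return "CIFS"
    return "either"
```

Factors beyond client OS, such as locking semantics, authentication model and existing infrastructure, should weigh on the final decision.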
- Our View: NetApp plays name game
- Solid-state storage nears prime time
Automate storage management
by Rich Bourdeau
IT process automation tools provide workflows that can help automate manual storage management processes. The real value is when these workflow engines are integrated with storage management apps to not only guide administrators through the process, but provide them with information to make intelligent decisions and automate some of the more basic tasks.
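The split between "automate the basic tasks" and "guide administrators through the rest" can be sketched as a simple policy gate. Everything here is an assumption for illustration, including the function name and the 10 GB threshold; it is not any workflow engine's actual API.

```python
def route_provisioning_request(requested_gb, auto_approve_limit_gb=10):
    """Route a storage request through a hypothetical policy gate.

    Small, low-risk requests are handled automatically; anything
    larger is escalated to an administrator for review. This is the
    basic pattern behind the workflow engines described above.
    """
    if requested_gb <= auto_approve_limit_gb:
        return "automate"
    return "escalate"
```

In a real workflow engine the "escalate" branch would carry context (current utilization, growth trends, ownership) so the administrator can make an informed decision rather than start from scratch.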
- Do you plan to eliminate tape from your backup process? If so, when?
- InfiniBand still a longshot for storage
- Symantec users find improved licensing
Legal toolkit for storage systems
Storage managers may be reluctant to admit it, but they and the storage systems they manage are key players in most companies' compliance and legal readiness procedures. While ediscovery is the buzzword of the moment, there's no all-encompassing ediscovery tool on the market. But you can assemble an effective toolkit with some of the point products that are available now.
- Power still not top of mind for storage pros
- DR testing not routine for all businesses
- eDiscovery By The Numbers
- A blade new world for storage
Storage Bin 2.0: Virtualization: It's not just for breakfast anymore
Storage has never received the glory for all of its virtualization efforts because the server side of the shop always seemed sexier, even though forms of storage virtualization have been around for years. But server virtualization and consolidation efforts might push storage virtualization into the limelight.
Best Practices: Storage provisioning steps to keep your infrastructure healthy
To avoid inefficiency, the storage provisioning function must be brought into balance. Users must consider changes to process, responsibility and ownership, and identify areas where newer technologies can help.
- Pivotal time for storage (Editorial)
Hot Spots: Just say 'Yes' to a new IT strategy
by Bob Laliberte
Software will increasingly transcend the self-imposed technology barriers that have evolved in larger data center environments. The ability to create policy-based programs that not only automate processes, but empower others to help themselves, will dramatically improve efficiency.