Terabytes per storage manager: An important metric
One way to measure the effectiveness of a storage management operation is the amount of data managed per storage management employee. This metric matters because trained storage administrators are in short supply and labor is a major component of the cost of managing storage.
Data managed per employee is also a moving target. According to market researcher IDC (www.idc.com), this figure has increased steadily over the last several years and should continue to increase. IDC says that in 1997 the average storage manager handled about 750 GB of storage. A 2001 IDC study found the average employee handling about 1.3 TB, and projected that by 2004 the average employee would be handling about 5.3 TB.
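The metric itself is simple arithmetic: total capacity under management divided by headcount. A minimal sketch, using illustrative numbers consistent with the figures cited above (the function name and inputs are hypothetical, not from IDC):

```python
def tb_per_admin(total_gb_managed: float, num_admins: int) -> float:
    """Return terabytes of storage managed per administrator.

    Hypothetical helper for illustration: converts GB to TB (decimal,
    1 TB = 1000 GB) and divides by staff count.
    """
    if num_admins <= 0:
        raise ValueError("need at least one administrator")
    return (total_gb_managed / 1000.0) / num_admins

# Example: a shop with 4 administrators managing 21,200 GB in total
# averages 5.3 TB per person, in line with IDC's projected 2004 figure.
print(tb_per_admin(21_200, 4))
```

Tracking this number over time (or across teams) is what makes the comparisons in the studies above possible.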
IDC also found that this figure can be increased significantly by using sophisticated storage management architectures that manage storage centrally. In one case, IDC found that, using such an architecture from EMC (www.emc.com), a company was able to manage 12 TB per person, and it estimated that by 2004 the same architecture would allow that company to manage up to 48 TB per person.
Rick Cook has been writing about mass storage since the days when the term meant an 80K floppy disk. The computers he learned on used ferrite cores and magnetic drums. For the last twenty years he has been a freelance writer specializing in storage and other computer issues.
Reader Scott R. Edenstrom had the following comments about this tip:
I think a big point is missing from this tip. We may be handling twice as much, but how effectively? Most open system environments (such as NT and UNIX) grow on a daily basis. New servers just appear with no plan to add additional support personnel. Somewhere somebody will read this and think their people aren't doing enough if they can't handle at least 1.3 TB. We have no choice but to handle more, but are we really doing the things that need to be done? Is the reason storage is growing so much partly due to the fact that we aren't managing it but merely adding servers to support new applications? Tools for managing open systems data are still on the cutting edge (archiving for example), and the tools that are available aren't cost-effective to put on every server.