Experts at DCI's Summit on Storage Management conference, being held here this week, say the absence of a single management console, inadequate training and the cost of lost data will add up to major headaches for administrators.
"Up until now, capacity planning has been ad hoc. Administrators have given storage to those who demanded it," said Evaluator Group partner and SearchStorage expert Randy Kerns.
Kerns was alluding to one slice of his six-tiered storage management structure, storage resource management (SRM). Kerns notes that SRM is increasingly confusing to end users, and he counts 28 vendors with SRM products on the market or in the pipeline.
"All of these products with different names for the same gee-whiz feature are causing definition problems," said Kerns.
To cut down on the number of options, David Tipple, project leader in corporate databases for Ottawa-based Statistics Canada, is considering consolidating his operation from a distributed environment to a centralized one.
"I have two guys running two different systems. They can't talk to each other. If one guy goes on vacation, there is a problem running the other system," said Tipple, who is in charge of Canadian statistics such as the country's census.
"Training people is going to be a problem. If you have 20 different tools …"
The other five areas of storage management in Kerns' structure are device management (individual devices), virtualization, storage area network management, management framework (e.g. CA's Unicenter, HP Openview, IBM Tivoli) and application management.
Experts also say staffing considerations are becoming more important to storage management in these times of shrinking budgets and growing data. According to a Merrill Lynch/McKinsey report presented by Fred Moore, president of Horison Information Strategies, 47% of direct-attached storage costs come from labor -- compared with 13% for network-attached storage (NAS) and 12% for SAN. Moore sees money shifting from labor to software in the fabric-attached market to aid automation and reduce human error.
But human error does not lead to the greatest amount of data loss, a stat that surprised Moore. According to Moore, that distinction falls to hardware. Forty-four percent of data loss is attributed to hardware, 32% to human error, 14% to software, 7% to viruses and 3% to natural disasters. But even though hardware ranks highest, it may not be the most costly.
"What's not known is the impact of losing data. Data lost by human error and software usually leads to corrupted data. Data lost from hardware can usually be restored, but there is no way to account for how much lost data can cost you," said Moore.
At the end of the day, users say they just want to manage their storage efficiently and keep the company flying in the right direction.