Storage vendors strut around with their benchmark results like roosters in a henhouse. But, like hens that have already laid their eggs, users are not impressed.
There is no question that benchmark results can be impressive, but users say that unless they run a system in the exact configuration vendors used to produce those results, the numbers are meaningless.
"We prefer to enter into a long-term partnership with a major storage vendor based on factors in addition to performance, so benchmark results never really come into play," said Ray Ball, a senior technical analyst and storage area network (SAN) administrator with toy maker Fisher-Price Inc., East Aurora, N.Y.
In recent weeks, a number of major storage makers have paraded their performance results. Computer Associates International Inc. and Microsoft Corp., for example, last week jointly announced a "record-setting" 2.6TB/hour archival backup of a Microsoft SQL Server 2000 database running on a Windows 2000 Datacenter Server. Weeks earlier, Hewlett-Packard Co. claimed "record-breaking performance results" for its StorageWorks Enterprise Virtual Array (EVA) based on the SPC-1, an industry benchmark standard.
Vendors often use benchmark results as sales bait, but many users refuse to bite.
Peter Litchford, a senior partner with Rowland Litchford Associates, Winnipeg, Manitoba, said he doesn't use benchmark results as a criterion for choosing storage systems. What he does consider is capacity, manageability and security.
"Since actual speed attained depends so much on the way in which the user works, we do not feel that a good general benchmark approach is possible," he said. "User needs can change rapidly, often invalidating the data mix and access approach used in the benchmark."
Litchford stressed that keeping data accessible and online is the main problem, not access speed.
Roger Reich, founder of the Storage Performance Council and senior technical director of interface standards for Veritas Software Corp., expects users to use benchmarks to compare vendor systems and assess which offers the best price/performance value.
In June 2002, the Storage Performance Council released a series of benchmark results based on the SPC-1 benchmark. The test measures the maximum input/output (I/O) rate, the total storage capacity read and written during the course of executing the benchmark, the data protection level and the average system response time of a given storage configuration. The SPC conducted the Computer Associates and HP benchmark tests.
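The headline figures such a test produces can be pictured as simple reductions over the raw measurements. The sketch below is illustrative only; the latencies, duration and capacity are invented sample data, not real SPC-1 results, and the function name is hypothetical.

```python
# Hypothetical sketch of how a benchmark run is reduced to SPC-1-style
# headline figures: I/O rate, average response time, and tested capacity.
# All numbers below are invented for illustration.

def summarize_run(latencies_ms, duration_s, capacity_gb):
    """Reduce per-request latencies to headline benchmark metrics."""
    io_count = len(latencies_ms)
    iops = io_count / duration_s                      # I/O requests per second
    avg_response_ms = sum(latencies_ms) / io_count    # average response time
    return {
        "iops": iops,
        "avg_response_ms": avg_response_ms,
        "capacity_gb": capacity_gb,
    }

sample = [4.0, 5.5, 3.8, 6.2, 4.5, 5.0, 4.8, 5.2]    # latencies in ms (made up)
result = summarize_run(sample, duration_s=2.0, capacity_gb=500)
print(result["iops"])                                 # 8 requests / 2 s = 4.0
print(round(result["avg_response_ms"], 3))            # 39.0 / 8 = 4.875
```

The point users keep making is visible even in this toy: every number depends entirely on the workload (`sample`) that was fed in.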
"Some of the early results that came out were on configurations that were arguably small. The industry is just getting ramped up in using the benchmark," Reich said.
Reich said that in a few years the SPC will have a large repository of results that will point users in the right direction.
According to some experts, storage benchmarks are inherently subjective.
"Storage benchmarks are a useful data point but only one piece of the puzzle when making storage decisions," said Michael Fisch, senior analyst with the Clipper Group Inc., based in Wellesley, Mass.
Fisch said it is also possible to "tune" equipment to perform exceptionally well on a test in ways that do not reflect its general fitness. But, he said, it's nice to know whether a box can go fast.
Jamie Gruener, senior analyst for the Boston-based Yankee Group Inc., said that user response to storage benchmarks is following a philosophy adopted in the server industry.
"Benchmarks are a checkmark item for the server industry. It's new for storage," said Gruener. "[Benchmarks] are another data point used to evaluate a vendor."
Gruener said that benchmark results should be one piece of the overall decision-making process. Benchmarks are a snapshot of real-world environments and should be taken with a grain of salt. The only way a user can be sure a storage system will perform as advertised, he said, is to test it in his existing IT environment.
Gruener said the storage industry is at a stage in which customers have the upper hand and can get vendors to prove their performance in the end-user environment.
Benchmark results have also fed the competitive fires in the storage industry, resulting in feuds that bordered on the ludicrous. There were a few rounds of mudslinging between EMC Corp., Hopkinton, Mass., and DataCore Software Corp., Fort Lauderdale, Fla., last September, when DataCore said it would give EMC a Porsche if EMC's claims of DataCore's shoddy performance were accurate.
"You have vendor engineers that understand these boxes very clearly. They're going to use some special tricks here and there to boost performance," said Gruener.
Marc Farley, storage expert and author of Building Storage Networks, Second Edition, said that benchmarks can be very reliable and valuable, with the caveat that they must represent a workload similar to the real-world environment where the product will run.
For instance, he said, a benchmark that measures throughput for a single high-volume client session may not be relevant for an environment with many clients performing sporadic and "bursty" data transfers.
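Farley's point can be sketched with a toy throughput model (all bandwidth and overhead figures here are assumptions invented for illustration): a fixed per-request cost that is negligible for one large streaming transfer comes to dominate when clients issue many small requests.

```python
# Toy model (invented numbers): delivered throughput for one transfer is
#   size / (per_io_overhead + size / raw_bandwidth)
# so a benchmark built around one large streaming transfer can look far
# better than the same hardware serving small, bursty requests.

RAW_BANDWIDTH = 200e6     # bytes/sec raw device speed (assumed)
PER_IO_OVERHEAD = 0.005   # seconds of seek/protocol cost per request (assumed)

def effective_throughput(io_size_bytes):
    transfer_time = PER_IO_OVERHEAD + io_size_bytes / RAW_BANDWIDTH
    return io_size_bytes / transfer_time  # bytes/sec actually delivered

streaming = effective_throughput(64 * 1024 * 1024)  # one 64MB sequential read
bursty = effective_throughput(8 * 1024)             # one 8KB random read

print(f"streaming: {streaming / 1e6:.0f} MB/s")     # streaming: 197 MB/s
print(f"bursty:    {bursty / 1e6:.1f} MB/s")        # bursty:    1.6 MB/s
```

Under these assumed numbers the same hardware delivers over a hundred times less throughput on small random I/O, which is why a single-stream record says little about a many-client, bursty environment.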
Farley added that the details of the configuration and all tuning options also need to be understood. "You can't blame the people running the benchmark," he said. "You can't expect them to knowingly settle for poorer numbers if there are techniques that will boost performance."

Let us know what you think about the story. E-mail Kevin Komiega, News Writer
FOR MORE INFORMATION:
Benchmarking the benchmarks
Group releases full storage benchmark results
First storage performance benchmarks announced
An alternative view of benchmarks and some useful downloads
Read the CA, Microsoft white paper on why benchmarks are relevant