Storage performance testing (SPT) is usually prompted by a pressing question that a storage administrator needs to answer. Sometimes those questions can be answered through in-house performance testing. In other cases, outside testing makes more sense. And smaller organizations typically rely on vendor expertise to address storage bottlenecks and help optimize configuration issues. As you read in the last section, each approach offers its own advantages -- and each carries potential drawbacks.
The hardest part of SPT is determining whether it's needed in the first place. Before any testing begins, an administrator has to identify an issue with the storage system in question. Analysts agree that storage performance testing should be avoided unless you are certain that storage is the problem -- SPT is not a silver bullet for general performance issues. "You need to understand your servers, your software, your network and your storage infrastructure," says Brian Garrett, industry analyst with the Enterprise Strategy Group. "If the performance problems that you have are not [related to] storage, then it's a silly exercise." For example, oversights like inadequate server memory or network congestion should be identified and corrected before any performance testing project is considered.
Doing it yourself
A test environment must be created for the system being tested, and tools must be implemented to simulate workloads and collect the resulting performance data.
The challenge is to establish a test environment, or "test bed," that is representative of the actual production network. In extreme cases, this might require a complete replication of the current setup. For most environments, however, a test bed can be established with far less hardware. The system being tested can usually be coupled with several servers, some connectivity/networking equipment and a variety of workstations. It's important to point out that there is no single or universal test setup for any piece of storage equipment -- every setup is different because every storage environment is different. Even an established test bed may need modifications to run different experiments or test variations.
Hardware should support the operating system(s) deployed across the real storage environment, and run the applications that are most critical to the enterprise. Other tools, such as agents or sniffers, may also be needed to collect performance information for further analysis. "You need something to create the traffic that you want to measure or improve," Garrett says. "Using the real application [e.g., Exchange] and trying to simulate traffic over that is the very best kind of benchmarking for storage." It may be impossible or impractical to implement actual applications in the test environment, so many organizations rely on test software, such as Iometer, that can simulate workloads and common traffic patterns and collect the resulting performance data.
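To see what "simulating a workload and collecting the data" means in miniature, a few lines of Python can compare sequential against random 4 KiB writes on a scratch file and report a rough IOPS figure. This is only an illustrative sketch -- the file name, sizes and buffered-write behavior are arbitrary choices, not a substitute for a purpose-built tool like Iometer:

```python
import os
import random
import time

BLOCK = 4096                    # 4 KiB blocks, a common test size
FILE_SIZE = 16 * 1024 * 1024    # 16 MiB scratch file, kept small for illustration
PATH = "spt_scratch.bin"        # hypothetical scratch-file name

def run_workload(pattern):
    """Write FILE_SIZE bytes in BLOCK-sized chunks, sequentially or randomly."""
    blocks = FILE_SIZE // BLOCK
    offsets = list(range(blocks))
    if pattern == "random":
        random.shuffle(offsets)   # randomize the write order
    buf = os.urandom(BLOCK)
    start = time.perf_counter()
    with open(PATH, "r+b") as f:
        for i in offsets:
            f.seek(i * BLOCK)
            f.write(buf)
        f.flush()
        os.fsync(f.fileno())      # writes are buffered, so force them to media
    elapsed = time.perf_counter() - start
    return blocks / elapsed       # crude I/Os-per-second figure

# Pre-allocate the scratch file, then compare the two access patterns.
with open(PATH, "wb") as f:
    f.truncate(FILE_SIZE)
seq_iops = run_workload("sequential")
rand_iops = run_workload("random")
print(f"sequential: {seq_iops:.0f} IOPS, random: {rand_iops:.0f} IOPS")
os.remove(PATH)
```

Because the writes pass through the page cache before the final fsync, the numbers blend cache and device behavior -- real test tools issue direct I/O to avoid exactly this distortion.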
Once a test configuration has been established, it's necessary to design a series of experiments that will help to evaluate the storage issue. Analysts note that objectives can vary widely from one test to the next. "Is it to test a particular workload or particular application?" says Greg Schulz, founder and senior analyst at Storage IO. "Is it to stress test it? Is it to find out how much workload you can put on it before response time degrades or before something breaks?" With clear test objectives identified, applications or workload generators can be configured accordingly. For example, a workload generator like Iometer might be configured for random writes, sequential reads or a mix of storage activity.
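As an illustration of what such a configuration looks like, the open-source generator fio (comparable to Iometer) takes a job file describing each workload. The parameters below -- block size, queue depth, mount point -- are assumptions chosen for the example, not values taken from any particular test plan:

```ini
; illustrative fio job file: sequential reads, random writes, and a mix
[global]
; Linux native asynchronous I/O
ioengine=libaio
; bypass the page cache so the array, not host memory, is measured
direct=1
bs=4k
size=1g
runtime=60
time_based
; hypothetical mount point on the storage under test
directory=/mnt/test

[sequential-read]
rw=read
; stonewall makes each job finish before the next starts
stonewall

[random-write]
rw=randwrite
iodepth=16
stonewall

[mixed-70-30]
; 70% reads, 30% writes at random offsets
rw=randrw
rwmixread=70
iodepth=16
```

Running `fio jobfile.fio` then reports throughput, IOPS and latency per job, which maps directly to the "random writes, sequential reads or a mix" experiments described above.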
Calling in outside help
Is it better to use an outside consultant or service provider to handle your storage performance testing? In-house testing assures complete control over the test environment and guarantees the confidentiality of any results. But the demands of in-house testing can sometimes overburden smaller IT organizations. The actual choice of venue depends on your own testing needs, but there are some compelling advantages to outside service providers.
First, there's a matter of priorities. Evaluating or comparing new storage products generally doesn't carry the same urgency as value-added tasks, such as integrating a new feature with the existing network, or just keeping the data center running on a day-to-day basis. Consequently, many busy IT departments relegate product comparisons and evaluations to low-priority or "time available" status -- often extending the testing time far longer than otherwise necessary.
There is also an issue of resources. Companies rarely have an extra few terabytes of storage or Fibre Channel SAN equipment sitting in a lab for testing purposes. Then there's the question of capabilities and expertise. Accurate testing requires a working knowledge of the platforms being tested, and even a knowledgeable IT staff may not have the scope of expertise to adequately test or compare new hardware devices -- especially devices from unfamiliar vendors.
In many cases, commissioning an outside service provider can overcome these limitations. Outside service providers can commit to a target date for completion, returning results in a matter of weeks rather than months. Since testing is the main business of an independent laboratory, its storage and hardware resources are typically dedicated to testing tasks. The staff of such a lab generally has significant engineering expertise and experience with the system types being tested, which usually results in better configurations and more comprehensive test plans.
While an outside firm can bring a level of objectivity to any testing process, there is also a concern about pre-existing relationships with vendors. Analysts typically agree that any type of relationship between a testing firm and vendor can taint the results -- even if the perception is unfounded. "An organization positioned to help businesses [evaluate hardware] should not align itself with a vendor," says Ashish Nadkarni, senior consultant at GlassHouse Technologies Inc. "I think that's very hard to find." For example, if a testing house accepts money from a vendor to perform testing, the results of that testing are potentially suspect.
Weighing the costs
Whether you opt to allocate hardware and staff in-house or engage an outside firm to handle testing for you, testing is going to cost you money. Any testing project should be preceded by a careful justification of those costs. For example, is it worth investing the time and resources to test the performance of two or three different storage systems if you're only going to buy one of them? Are the test results going to cost more than the acquisition or upgrade? "Testing is expensive," Schulz says. "You need to have a good reason why you're doing it -- and will the reason offset the cost of it?"
Go to the next part of this article: Storage performance testing: The vendors