Storage performance testing can be performed in-house by establishing a test setup designed to simulate the workload and traffic patterns expected of a system under test. Tools to test specific applications are readily available from software vendors. For example, Microsoft provides free tools like Loadsim and Jetstress to create a simulated Exchange environment. Some storage administrators prefer more general tools that can simulate workloads and gather operational data across a wide range of hardware/software platforms -- Iometer is one popular, freely available open source tool. If you'd prefer to have testing performed by an outside organization, independent labs like Diogenes Analytical Laboratories Inc. and Lionbridge Technologies Inc. might serve your needs.
Versatility and cost-effectiveness are both addressed in Iometer, a freely available open source I/O subsystem measurement tool supporting both single and clustered system configurations. Iometer was originally developed and introduced by Intel Corp. in early 1998 but was released to the Open Source Development Lab in 2001. Today, Iometer is maintained by an independent team of developers from around the world. Their goal is to establish Iometer as a versatile cross-platform testing tool. "We support Windows, Linux, NetWare, MacOS and Solaris," says Ming Zhang, senior storage architect at Tandberg Data and one of Iometer's maintainers. He notes that Iometer supports a variety of processor platforms, including 32- and 64-bit Intel and AMD processors, SPARC processors from Sun Microsystems and PowerPC processors from IBM. A complete compatibility matrix can be found online at the Iometer Web site.
Iometer software essentially serves two purposes. First, it provides a workload generator that can simulate a wide range of predefined traffic patterns or be tuned to support custom workloads that most closely match your particular environment. Although it's important to understand the type of workloads you need to simulate, Zhang says that Iometer is actually quite easy to configure. "Even a new user, if they have some storage knowledge, can configure Iometer in 10 minutes at the most." Second, Iometer logs test results for further analysis. While Iometer offers a dashboard-type display for performance numbers during testing, it does not offer internal analytical features. Consequently, performance data is usually exported to Excel for detailed examination and charting.
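Because Iometer leaves analysis to the user, teams often script the first pass themselves before charting in Excel. The sketch below shows the general idea in Python: averaging per-worker IOPS, throughput and latency from a results export. The column names and sample data here are illustrative assumptions, not Iometer's exact CSV schema -- check your own export's headers before adapting it.

```python
import csv
import io

# Hypothetical results export. Column names are illustrative
# assumptions, NOT Iometer's actual CSV layout.
SAMPLE_CSV = """Target,IOps,MBps,Avg Response Time (ms)
Worker 1,5120.4,40.0,1.25
Worker 2,4980.2,38.9,1.31
Worker 3,5201.7,40.6,1.22
"""

def summarize(results_csv):
    """Average the per-worker metrics from a results export."""
    rows = list(csv.DictReader(io.StringIO(results_csv)))
    n = len(rows)
    return {
        "workers": n,
        "avg_iops": sum(float(r["IOps"]) for r in rows) / n,
        "avg_mbps": sum(float(r["MBps"]) for r in rows) / n,
        "avg_latency_ms": sum(float(r["Avg Response Time (ms)"])
                              for r in rows) / n,
    }

if __name__ == "__main__":
    stats = summarize(SAMPLE_CSV)
    print(f"{stats['workers']} workers, "
          f"{stats['avg_iops']:.0f} IOPS avg, "
          f"{stats['avg_mbps']:.1f} MBps avg, "
          f"{stats['avg_latency_ms']:.2f} ms avg latency")
```

A summary like this gives a quick sanity check between runs; the raw per-worker rows would still go to a spreadsheet for detailed charting, as the article describes.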
As with most test tools, Zhang points out that Iometer does not include the intelligence to determine whether test results are good or bad -- an IT team performing the tests must thoroughly understand the performance objectives of a system under test and have enough knowledge to interpret Iometer's performance results. Data analysis can be problematic for smaller organizations with limited staff or expertise and may sometimes require the support of a storage consultant.
Diogenes Analytical Laboratories Inc. is one small privately held startup specializing in vendor-neutral IT consumer advocacy. "We don't accept money from vendors to do published research, and we also actually test products in the lab," says Phil Goodwin, president of Diogenes. "What we try to be is the eyes and ears of the IT buyer." Storage readers may recognize the analytical work of Diogenes Labs in Storage magazine's Quality Awards.
For Diogenes, the emphasis is on comparative testing where groups of products are tested within the same category, such as disk-to-disk products, SRM tools, backup/recovery software and so on. "We may do eight products or 10 products in the category," Goodwin says. However, Diogenes will also apply its engineering expertise to test specific systems as mandated by its IT clientele. Most testing projects can be accomplished by one or two engineers over six weeks or less -- the exact timing depends on the scope and complexity of the project. According to Goodwin, the cost for a Diogenes testing project can be as low as $10,000 but generally ranges from $15,000 to $25,000. This includes establishing objectives, defining the test protocol(s), conducting the tests and presenting the results.
It's important to note that testing cannot take place in a vacuum, and Diogenes engineers will typically work with one or more members of the client's IT staff to understand the storage environment, establish test criteria and report on progress. At the conclusion of a project, the client receives a presentation that specifically addresses the questions or objectives of the testing process. "What they should expect is really a distilled version of the bottom line," Goodwin says, also noting that clients have complete access to all of the test data.
Goodwin urges prospective clients to look for expertise in any testing provider and scrutinize the relationships between a provider and a vendor to ensure testing integrity. "They [service providers] should be willing to disclose those relationships," he says.
Getting ready for market
When a vendor designs a new product, much of the testing is accomplished at a relatively low level. That is, the vendor knows how the new device will transfer data, but a vendor may not be positioned to run the device through challenging real-world performance tests with applications like Oracle or Exchange. Larger outsourced service providers like Lionbridge Technologies Inc. (through its VeriTest branded service) cater to the outsourcing needs of organizations seeking to evaluate new or updated storage products and push those products to their performance/scalability boundaries.
Jennifer Turnquist, storage service line director at Lionbridge, explains that vendors often struggle to test their products under large-scale deployments or large volumes of storage. Lionbridge touts the ability to simulate a very large number of physical clients (e.g., over 1,000 clients) and provide high volumes of storage to accommodate the rigors of product boundary testing. "We've done some testing that has required 100 TB [terabytes] of storage," Turnquist says. "Vendors don't usually have 100 TB just lying around that they can set aside to do testing with."
Costs can vary dramatically depending on the test scope and requirements. According to Lionbridge, a one-time benchmark analysis can run $40,000, while annual functional performance testing can cost several hundred thousand dollars. Most typical engagements range from $75,000 to $150,000 per instance. Once a project is completed, the client receives a detailed analysis of the results and access to the entire data set, including log files and output files from testing tools. "We try to give them [clients] some real actionable information that they can go and make some improvements or enhancements to their product," Turnquist says.
In any kind of testing, Turnquist says that goals and objectives are the most important things to consider. "It's often easier if you know what the target is," Turnquist says. "Make it a business decision rather than a technical decision."
Go to the next part of this article: Storage performance testing: User perspectives