SAN FRANCISCO – Regular testing is necessary for keeping hardware, software and business processes tuned for optimal performance, speakers emphasized during the opening day of Storage Decisions.
Brian Garrett, Enterprise Strategy Group lab technical director, provided recommendations for testing and tuning storage systems, while Glasshouse Technologies storage consultant Jeff Harbert covered the challenges of testing an organization's backups and restores.
Garrett told attendees that the most important task for testing and tuning is to balance workloads over the right number of drives protected with the appropriate RAID level. "The disk drive is the slowest component in a computer," he said. "Spreading the workload over a large number of drives can solve most performance problems."
He recommended testing when evaluating a system for purchase, changing an existing system or determining if the system is well-tuned after a change. Different tests can be used to determine which system is faster among like systems, which system does more work in parallel and which system has the best price/performance.
Transactions, response time and IOPS
The key metrics to measure, Garrett said, are transactions (the number of applications that can be supported at an acceptable response time, measured as transactions per second, email database operations per second or file operations per second), response time (the speed of each I/O, measured in milliseconds), throughput (how much data can be moved through a storage system) and IOPS.
Measuring IOPS, he said, "doesn't tell you about real-world applications, but gives you an idea of the storage engine of a system."
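As a rough illustration of how two of these metrics relate (this sketch is not from Garrett's talk, and the IOPS and block-size figures are hypothetical), throughput is approximately IOPS multiplied by the I/O transfer size, which is why a high-IOPS transactional workload can still move far less data than a streaming workload:

```python
# Rough relationship between IOPS, transfer size, and throughput.
# All figures are hypothetical examples, not measurements.

def throughput_mb_per_s(iops: float, block_size_kb: float) -> float:
    """Throughput (MB/s) is approximately IOPS x transfer size."""
    return iops * block_size_kb / 1024

# A small-block transactional workload: high IOPS, modest throughput.
print(throughput_mb_per_s(20000, 4))    # 20,000 IOPS at 4 KB -> 78.125 MB/s

# A large-block streaming workload: lower IOPS, higher throughput.
print(throughput_mb_per_s(1000, 256))   # 1,000 IOPS at 256 KB -> 250.0 MB/s
```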
Garrett suggested that when looking to buy a storage system, an admin should request industry-audited benchmark results from vendors. He also said that price/performance is more important than raw performance numbers.
Some useful testing tools are built into products, according to Garrett. For example, bandwidth monitoring capabilities built into Fibre Channel switches from Brocade and Cisco indicate whether an upgrade from 4 Gbps to 8 Gbps devices will help. "They tell you how much bandwidth is being used," he said. "If it's about 15%, you probably wouldn't benefit from going up to 8 Gb. If it's saturated at around 60%, you'll get more bang for your buck from an upgrade."
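Garrett's rule of thumb can be sketched as a simple check. The thresholds below are the ones he cited; the function name and the middle "keep monitoring" band are assumptions for illustration:

```python
def upgrade_advice(utilization_pct: float) -> str:
    """Apply Garrett's rough rule of thumb for a 4 Gbps -> 8 Gbps upgrade.

    Thresholds follow his examples: ~15% utilization suggests little
    benefit, while links saturating around 60% are upgrade candidates.
    The in-between verdict is an assumption, not from the talk.
    """
    if utilization_pct <= 15:
        return "unlikely to benefit from 8 Gbps"
    if utilization_pct >= 60:
        return "good candidate for 8 Gbps upgrade"
    return "monitor further before deciding"

print(upgrade_advice(12))   # unlikely to benefit from 8 Gbps
print(upgrade_advice(65))   # good candidate for 8 Gbps upgrade
```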
He also pointed to storage management products, such as Akorri's BalancePoint and Virtual Instruments' NetWisdom, as useful for monitoring performance of systems in-house.
Backup testing: Don't forget restores
Jeff Harbert said that backup testing is a crucial step in making sure critical data is protected, although many organizations don't do it often enough, or at all. While backup testing is not as complicated or expensive as disaster recovery testing, it does require a concerted effort from administrators and management buy-in, he said.
"Backup testing is not easy," Harbert said. "Recovery testing is not easy. It requires a lot of resources, and most people don't have the time to do this."
Harbert advised testing the entire backup and restore process to make sure that data can be recovered in case of an outage. It's not enough to know the data was backed up successfully. "If 50% of backups failed, but 100% of restores worked, would anybody notice?" he asked. "If 100% backups were successful, but only 50% of restores worked, that would be a bigger problem."
Tests should determine if backed up data is recoverable, if the right technology is being used and if the performance is sufficient, Harbert said. Tape and disk backup products need to be tested, as well as data deduplication and encryption (if they're being used). He recommended testing new devices at the time of implementation and said that the results of tests should meet application owners' requirements.
Harbert also said that administrators should take a "fire drill approach" to backup testing. They should randomly select applications and associated servers for a recovery test, have a third-party observer help track the process and give no more than three days' notice to participants.
"If you can tell me when a test is going to happen, it's not really a test," he said. "You'll tweak things so the test works. That is not the real world."
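The random-selection step of the fire drill approach can be sketched in a few lines. The application list, the number of targets and the helper name are hypothetical; the only parts taken from Harbert's advice are the random selection and the short-notice constraint:

```python
# Minimal sketch of the "fire drill" selection step: randomly pick
# applications (and their servers) for a surprise recovery test.
# The application names and target count are hypothetical examples.
import random

applications = ["payroll", "email", "crm", "file-server", "erp"]

def pick_fire_drill_targets(apps: list[str], count: int = 2) -> list[str]:
    """Randomly select applications for a restore test, per the
    fire-drill idea that participants can't predict what gets tested."""
    return random.sample(apps, count)

targets = pick_fire_drill_targets(applications)
print(f"Recovery test targets (max 3 days' notice): {targets}")
```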
Data protection continues to be Job #1 for most storage managers -- a job that only gets more complex as companies add up to 50% more storage capacity each year. The good news is that newer technologies can help in this uphill battle, but you still have to determine the right tools for the job, how they will work within the context of your storage environment and whether they provide the level of protection that your company requires.
Among the topics covered in this track are data deduplication, virtual tape libraries, the newly integrated backup suites, matching data protection levels to business needs and how archiving fits into a data protection scenario.