SRM tools struggle to meet today's demands

At one time, storage resource management (SRM) applications tried to be all things for all storage shops, with little success. Modern data storage environments require new tools to navigate the intricacies of virtualized environments, but the jury's still out on whether storage management vendors can meet those needs.

The information storage managers want most about their data storage systems probably hasn't changed since storage arrays began appearing in the enterprise. These days, however, storage managers need even more than that if they expect to run an efficient storage operation that addresses business needs as well as storage needs.

"The basics haven't changed: how much storage do I have, how much is still available and what is the utilization," said Bob Laliberte, senior analyst at Milford, Mass.-based Enterprise Strategy Group (ESG). But Laliberte doesn't stop there. Storage managers also want to look across multiple storage tiers and arrays from different vendors.

"Of course, we look at the usual metrics to make sure we have enough storage capacity," said a storage manager at a large financial trading firm. However, he wants more than that from his storage resource management (SRM) tool, Aptare Inc.'s StorageConsole. "We have customers who do a lot of algorithmic trading, which entails many more orders per trade. That means there's more data to be stored for each trade," he explained. The company uses the Aptare application to track the storage activity for chargeback purposes.

Chargeback is just one of the new metrics storage managers want from SRM tools. While they're chasing down the basic capacity data, they might as well collect storage-area network (SAN) configuration data, information on network paths, and data on backup success/failure rates with their SRM tool. Others see the need to extend SRM even further, with storage and SAN performance metrics, and visibility into the virtualized environment. "We also see SRM providing real-time performance metrics," said Jeff Boles, senior analyst and director, validation services at Hopkinton, Mass.-based Taneja Group. In short, storage resource management has to be about more than just available capacity.

Changing SRM landscape

The rise of players like Aptare signals a change in the SRM landscape. "The big SRM players of the past are now littered along the side of the road . . . maybe a few have survived," said Greg Schulz, founder and senior analyst at StorageIO Group, Stillwater, Minn. IBM remains with its various Tivoli tools, while Hewlett-Packard (HP) Co. now has Storage Essentials, into which it folded its AppIQ acquisition. EMC Corp. includes SRM in its Ionix family of tools.

Many other management software vendors have been absorbed by bigger players. Onaro Inc., which provided visibility into the SAN, was acquired by NetApp. Tek-Tools Software, a popular small SRM player, was acquired by SolarWinds, and Quest Software Inc. picked up MonoSphere Inc. and its Storage Horizon capacity management tool in 2009. Similarly, IBM acquired NovusCG in 2007, while Opsware acquired CreekPath Systems in 2006 and was then acquired by HP the following year.

Yet fresh players continue to arrive on the scene, Schulz noted. For example, SANpulse Technologies Inc., which isn't exactly a startup, provides analysis and correlation.

"Overall, we're seeing a shift from the large monolithic, costly SRM tools to lighter, easier to use, more nimble tools," Schulz said. The old SRM tools typically required extensive customization and took upwards of a year to deploy before a company would start seeing real value. "Today's smaller, lighter tools are relatively cheap and fast to install. The users get value almost immediately," he added.

Aptare CEO Rick Clark refers to the earlier storage resource management players as SRM 1.0. He dubs new players, like his company, as SRM 2.0.

In addition to their high cost and big footprint, the older SRM products relied heavily on agent technology. Agents made it possible to collect detailed information from different vendors' storage arrays, but they slowed deployment and complicated management, since each agent became a potential problem and expense. The older products also often required a fat client on the storage administrator's desktop, which, at a minimum, might have to run one or more Java applets.

SRM 2.0 tools, by comparison, are generally agentless. If they're not agentless, "they find creative ways to get around the use of agents by substituting a more friendly kind of agent," StorageIO Group's Schulz said. This often takes the form of an external appliance that knows how to dig out the desired information from each attached storage device.

"Agentless SRM approaches certainly are a lot easier," said Steve Scully, research manager, continuity, disaster recovery (DR) and storage orchestration at Framingham, Mass.-based IDC. "Previously, companies had to commit to installing lots of agents. In effect, you were adding to the problem in your attempt to fix the problem."

Despite the tumult in the storage resource management segment over the past few years, IDC's ranking of top SRM vendors hasn't changed much. "EMC is the big gorilla in the space with Ionix," Scully said. It's followed by IBM with its TotalStorage Productivity Center, CA's Storage Resource Manager and HP's Storage Essentials products.

The challenge for all of these tools is multivendor information collection. SNIA's Storage Management Initiative Specification (SMI-S) was supposed to solve the multivendor challenge, but it hasn't fully delivered (see "What's happening with SMI-S," below). "The problem with SMI-S is that it's lowest common denominator," StorageIO Group's Schulz said. "Vendors can make extensions, but to really make it work you need to use those or else go directly to the vendor's API." The need for extensions or APIs misses the point of having a management information standard.

What's happening with SMI-S

SNIA's Storage Management Initiative Specification (SMI-S) is alive and well. The standard was once touted as bringing storage management nirvana, letting storage managers collect, consolidate and analyze data from every storage device in their environment regardless of vendor; today, SNIA holds more realistic expectations for it. Rather than replacing device-specific tools, SMI-S provides a base level of information that storage vendors can extend as desired.

Officially, SMI-S defines a method for the interoperable management of a heterogeneous storage network: information is presented to a Web-Based Enterprise Management (WBEM) client by an SMI-S-compliant Common Information Model (CIM) server, using an object-oriented, XML-based messaging interface designed to support the specific requirements of managing devices in and through SANs. In plain English, it defines a way for standards-compatible tools to get at sets of common information from SMI-S-compliant storage devices.

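As a rough illustration of what that looks like in practice, here's a minimal sketch of a WBEM query against an SMI-S provider using the open-source pywbem library. The provider URL, credentials and namespace below are placeholders, not any particular vendor's values, and real arrays typically expose vendor-specific namespaces and extended classes.

```python
# Minimal sketch, assuming a reachable SMI-S provider at a placeholder URL.
# pywbem speaks the CIM-XML protocol that SMI-S is built on.
import pywbem

conn = pywbem.WBEMConnection(
    "https://smi-provider.example.com:5989",  # hypothetical provider address
    ("cimuser", "secret"),                    # placeholder credentials
    default_namespace="root/cimv2",           # namespace varies by vendor
    no_verification=True,                     # lab use only: skips TLS checks
)

# CIM_StorageVolume is a standard SMI-S class; NumberOfBlocks and BlockSize
# are standard properties from which raw capacity can be derived.
for vol in conn.EnumerateInstances("CIM_StorageVolume"):
    blocks = vol.get("NumberOfBlocks") or 0
    block_size = vol.get("BlockSize") or 0
    print(vol.get("ElementName"), int(blocks) * int(block_size), "bytes")
```
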
Interoperability testing has already begun on SMI-S Version 1.4. However, SMI-S only takes you so far; data storage administrators will still need device-specific tools and vendor APIs and CLIs to perform certain tasks. For most routine tasks -- approximately 80% of what an administrator does day to day -- SMI-S alone should be sufficient, according to Wayne Adams, senior technologist and director of standards within the Office of the CTO at EMC Corp., as well as SNIA's chairman.

Given the disarray in the storage resource management market, many users end up with multiple tools to handle SRM issues. For example, Gorilla Nation, an Evolve Media company headquartered in Los Angeles, uses NetApp's tools to handle SRM on its filers and a proprietary SRM tool built into its ParaScale Inc. internal storage cloud. "No [single] SRM tool looks across everything or gives you all the information you want," said Alex Godelman, Gorilla's senior vice president of technology. The company handles online advertising sales for several hundred websites, as well as dozens of its own websites.

Many users simply skip using SRM tools and make do with whatever management tools their storage array vendor provides. LifeScript.com, a women's health Web portal based in Mission Viejo, Calif., uses 3PAR storage in its highly virtualized server environment. "We're very fluid. We might put up a server one day and allocate storage to it and then scrap it the next," said Gary Rizo, director of IT operations. For storage management data, the company draws on 3PAR's reporting tools. "We want to know when we're running out of storage, so we use the 3PAR System Reporter to configure alerts when we approach a threshold," Rizo explained.

New SRM metrics

Storage administrators still need the basic metrics around available storage. "They want to get a handle on capacity; what's available and when they're going to run out," IDC's Scully said. In the decade or more since SRM appeared, however, the IT infrastructure has changed, growing far more complex. This complicates the storage resource management challenge.

Virtualization, for example, creates problems for storage admins. "You lose visibility," Scully noted. "You no longer see a one-to-one server-storage relationship." Now storage admins need automated discovery and path mapping from the application and the virtual machine to the physical server, through the network switch to the storage array and down to the LUN and file system. Throw all of that into a multivendor environment and the complications snowball.

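As an illustration of the kind of end-to-end record such discovery has to produce, here's a minimal sketch; the field names and values are hypothetical and not drawn from any particular SRM product.

```python
# Illustrative only: one record an SRM tool might keep for each discovered
# I/O path in a virtualized environment. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class StoragePath:
    application: str       # workload that owns the I/O, e.g. a database
    virtual_machine: str   # VM hosting the application
    physical_host: str     # hypervisor the VM currently runs on
    hba_wwpn: str          # initiator port on the physical host
    switch_port: str       # fabric port the HBA logs in through
    array: str             # target storage system
    lun_id: int            # LUN presented to the host
    filesystem: str        # datastore or file system built on the LUN

# Example entry; every value here is made up.
path = StoragePath("order-db", "vm-ord-01", "esx-host-07",
                   "50:01:43:80:12:34:56:78", "fab-a/port-12",
                   "array-03", 42, "datastore-ord")
print(path)
```
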
Today's heightened focus on governance, risk and compliance (GRC) also puts new demands on storage resource management. "You need the SRM tool to track storage usage against policies," Scully said.

Still, Scully hasn't sensed any clamor at the storage admin level for new SRM metrics. "Many are too busy just trying to keep up to think about new metrics. This is really a higher-level IT concern," he said, one that shows up at the data center level.

The new SRM metrics fall into three broad categories: performance metrics, backup and disaster recovery metrics, and green metrics (see "New SRM metrics," below). The "green" category is only just emerging.

New SRM metrics

These new storage metrics, based on input from storage administrators and analysts, should find their way into future storage management products and practices:

  • Cost per IOPS
  • Kilowatts per IOPS
  • Kilowatts per unit of bandwidth
  • Kilowatts per TB
  • Backup errors per night/week
  • Speed to recover disk/file/server
  • Storage usage per user/application/business unit
  • Storage usage by tier/HBA/LUN
  • Storage compliance with SLA

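To show how a few of the metrics above could be derived from numbers most shops already track, here's a minimal sketch; all of the input figures are made up for illustration.

```python
# All input figures are hypothetical; substitute real billing, performance
# and facilities numbers.
annual_storage_cost = 250_000.0  # $ per year for the array
sustained_iops = 40_000          # measured sustained IOPS
power_draw_kw = 6.5              # kilowatts drawn by the storage system
usable_tb = 120                  # usable capacity in TB

cost_per_iops = annual_storage_cost / sustained_iops
kw_per_tb = power_draw_kw / usable_tb

print(f"Cost per IOPS: ${cost_per_iops:.2f} per year")
print(f"Kilowatts per TB: {kw_per_tb:.3f} kW/TB")
```
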
Performance metrics

"In terms of performance, I want to see metrics that show choke points by measuring throughput on the array, the cache, by RAID group or the switch," said the trading firm's storage manager. Aptare gives him everything he needs in terms of capacity management and chargeback. To get visibility into potential choke points along the end-to-end storage process, he turns to tools provided by his array and switch vendors.

For the Law School Admission Council Inc. in Newtown, Pa., a service provider to its member law schools, performance management requires keeping close tabs on available capacity, especially during its busy season. Using HP's Storage Essentials, the council monitors changes in capacity. "We try to be proactive," said Jerry Goldman, director of technical services. "We can add capacity on the fly or clear out data." He said that when capacity hits the 85% threshold, it gets checked every six hours; when it reaches 90%, the warnings come every three hours; and if it hits 95%, the storage people are on it hourly.

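As a minimal sketch of that escalating check schedule, the thresholds below mirror the ones Goldman describes; the function itself and its names are purely illustrative.

```python
# Thresholds mirror the ones quoted in the article; the function itself is
# a hypothetical illustration, not part of HP Storage Essentials.
from typing import Optional

def check_interval_hours(utilization_pct: float) -> Optional[int]:
    """How often (in hours) capacity should be rechecked, or None if
    utilization is still below the first alert threshold."""
    if utilization_pct >= 95:
        return 1   # storage staff check hourly
    if utilization_pct >= 90:
        return 3   # warnings every three hours
    if utilization_pct >= 85:
        return 6   # checked every six hours
    return None    # normal monitoring cadence

for pct in (80, 86, 91, 96):
    print(f"{pct}% full -> {check_interval_hours(pct)}")
```
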
Tom Becchetti, the Unix and storage manager at a midsized medical research institution, has lately started paying more attention to performance. "Now that everything is virtualized, it's driving higher I/O. So I'm concerned about response time in IOPS," he said. His usual SRM tools don't show end-to-end performance, so he's looking for tools that operate at the subsystem level.

"A lot of SRM tools are missing performance metrics, especially around IOPS," StorageIO Group's Schulz added.

Backup and disaster recovery metrics

For some IT managers, backup and disaster recovery metrics have become important. "I'd like to see metrics on how quickly we can recover a website. If I have a certain amount of storage, how long will it take me to recover in the event of a failure?" LifeScript.com's Rizo queried.

The financial trading firm is also concerned with backup and recovery metrics. "We're an ITIL [Information Technology Infrastructure Library] shop and have to meet specific backup and recovery service targets," said the firm's storage manager. Aptare gives him reports on successful backups, which he can massage for trending.

While companies are looking at backup and disaster recovery metrics, they should also look at replication performance and success/failure. "The same SRM tool should report on replication errors and what kind of errors they were," StorageIO Group's Schulz said.

Green metrics

Green storage resource management metrics have barely begun to take hold. "No customers are asking us for green storage metrics. There are not yet tools to expose that data," said Kalyan Ramanathan, HP's director of business service management. Aptare's Clark added: "Most storage components don't give power consumption. Even their APIs and CLIs don't give that information." Once storage vendors can expose that data, SRM tools will be able to grab it.

But some storage managers see potential benefits from gathering green metrics. "Take something like kilowatts/TB. Changes in a number like that could be an early predictor of a disk problem," said John Wonder, an independent storage consultant in San Mateo, Calif. Other storage-related energy metrics, such as energy cost per TB, he suggested, might be valuable in a large data center or in a location where energy availability is severely constrained.

What's in a name?

Given the emerging interest in new storage metrics and the growing use of virtualization, the storage resource management label is perhaps out of date. With interest in wide and deep end-to-end visibility, StorageIO Group's Schulz suggested that "SSRM" -- systems and storage resource management -- would be a more appropriate term in this virtualized IT era. Trying to monitor and measure storage performance without looking at switch ports and servers, both virtualized and physical, will be increasingly frustrating.

BIO: Alan Radding is a frequent contributor to TechTarget sites.

This was first published in April 2010
