Despite years of constrained budgets and limited buying, most organizations believe they can mine more value out of their existing storage assets. Disparate storage area network (SAN) islands and multiple servers--combined with limited reporting capabilities--reinforce this belief. The IT management mandate is clear: Before adding more capacity, get more mileage out of what's already on the floor.
A storage resource management (SRM) tool can identify the capacity you have, control how it's being used and forecast how much more storage you will require. Any SRM tool collects data on file aging and disk and network performance, as well as on customer requirements such as application performance expectations. Correlating that information helps determine which pain points to address first. Most organizations will find that their storage infrastructures aren't optimized and that their resources aren't shared.
Keep in mind, however, that all enterprise SRM tools are still maturing. Most work best at the departmental level or in a single geographic location, although an increasing number can now roll data up to provide enterprise reporting. But there's a big difference between enterprise reporting and enterprise management, and the ability to do both from the same central console isn't here yet, at least for most products.
Five steps toward capacity management
Organizations must first decide what capacity they want to measure. That decision will largely determine what tool or set of tools to choose. A tool should feature:
- Disk utilization reporting
- File-level reporting
- Database reporting
- Visualization of the storage network infrastructure
- Storage array reporting
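The first of those criteria--disk utilization reporting--reduces to walking each mount point and recording capacity figures. The following is a minimal sketch in Python using only the standard library, not any vendor's implementation; the mount-point list is whatever your environment provides:

```python
import shutil

def disk_utilization_report(mount_points):
    """Return basic capacity statistics for each mount point --
    the most fundamental output an SRM tool provides."""
    report = {}
    for mount in mount_points:
        usage = shutil.disk_usage(mount)  # named tuple: total, used, free (bytes)
        report[mount] = {
            "total_gb": usage.total / 1024**3,
            "used_gb": usage.used / 1024**3,
            "percent_used": 100 * usage.used / usage.total,
        }
    return report

# Example: report on the root filesystem.
for mount, stats in disk_utilization_report(["/"]).items():
    print(f"{mount}: {stats['percent_used']:.1f}% of "
          f"{stats['total_gb']:.1f}GB used")
```

A real SRM agent would add scheduling, filtering and historical storage on top of a collection loop like this one.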
Tools like Fujitsu Softek's Storage Manager and Tek-Tool's Storage Profiler for OS provide one small, easy-to-deploy agent that meets the first two criteria. Their agents can be installed and configured in just a few minutes, consume minimal disk space (a maximum of 30MB) and remain dormant except during scheduled collection times. They measure disk utilization filtered by criteria such as server, user and group. For file-level reporting, the agents provide more granularity, enabling users to identify specific file types such as .mpg, .jpg and .xls, determine the age of those files and view historical trending and forecasting information that shows how fast each file type is growing and predicts future growth.
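The file-level reports described above can be approximated with a short script. This is an illustrative sketch only, not any vendor's implementation; the extension list is an arbitrary example:

```python
import os
import time
from collections import defaultdict

def file_type_report(root, extensions=(".mpg", ".jpg", ".xls")):
    """Summarize count, total size and oldest age per file type --
    the raw material for trending and forecasting reports."""
    now = time.time()
    summary = defaultdict(lambda: {"count": 0, "bytes": 0, "max_age_days": 0.0})
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            if ext not in extensions:
                continue
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            entry = summary[ext]
            entry["count"] += 1
            entry["bytes"] += st.st_size
            entry["max_age_days"] = max(entry["max_age_days"],
                                        (now - st.st_mtime) / 86400)
    return dict(summary)
```

Running such a scan on a schedule and storing each snapshot yields the historical series from which growth can be forecast.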
Similar tools from EMC, for example, aren't as simple to install and configure because of the way they're architected. EMC currently offers two tools: VisualSRM, an easier-to-install point solution that gathers less information, and its flagship product, Control Center (CC), which requires the deployment of a 100+MB agent and licensing for four products: Control Center, StorageScope, StorageScope FLR and Automated Resource Manager (ARM). The CC agent can consume anywhere from 1% to 15% of a host's CPU while it performs storage capacity monitoring and reporting functions.
Some companies, such as AppIQ, collect device-level and host-level capacity information without an agent. Its StorageAuthority Suite software--written to the Storage Networking Industry Association's Storage Management Initiative Specification (SMI-S) and the Distributed Management Task Force's Common Information Model (CIM) standards--gathers this information simply by querying SMI-S-compliant operating systems. The query may take place over either a Fibre Channel (FC) fabric or an Ethernet network. The operating system vendor must provide or ship a CIM Object Manager (CIMOM)--such as Microsoft's Windows Management Instrumentation (WMI)--and devices must offer SMI-S interfaces.
SMI-S compliance should no longer be considered optional when making new storage purchases. While users should remain skeptical of any vendor's claim that its capacity management tool can manage a storage infrastructure through the SMI-S standard alone, they can certainly expect these products to deliver fundamental reporting capabilities. Any capacity management product that doesn't have an SMI-S road map shouldn't be viewed as a viable long-term solution.
Gaining visibility into databases beyond their raw size requires deploying database-specific agents. Most products need a reporting agent tailored to the database being monitored; for example, an Oracle database requires an Oracle agent. Some products, such as Storability's Global Storage Manager (GSM), require two agents--a core database agent and one specific to the database being monitored.
What to look for in a capacity management tool
In addition to database reporting, most vendors offer tools that report on which switch ports are free and used, which volumes on storage arrays are allocated and unallocated, and how much traffic is traversing the SAN. How well each product does this depends on how it's architected and whether the vendor integrates this function into its overall reporting.
The products that best integrate these reports have the largest agents and, invariably, the most complicated installations. That's changing, however, as newer, better-designed products with small agents can read the worldwide name (WWN) from a server's host bus adapter (HBA) and, from that information alone, identify the WWN's path through the SAN and all of the storage assigned to that WWN on any storage array. Older products use more cumbersome methods to gather the same information.
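The WWN-based discovery described above can be pictured with a toy in-memory model. Every WWN, port name and LUN ID below is made up; a real product would pull this data from fabric name servers and array LUN-masking tables rather than hard-coded dictionaries:

```python
# Illustrative only: a toy model of WWN-based SAN discovery.
FABRIC_ZONES = {
    # HBA WWN -> switch ports along its path through the fabric
    "10:00:00:00:c9:2e:ab:01": ["switch1/port4", "switch2/port9"],
}
ARRAY_MASKING = {
    # HBA WWN -> LUNs masked to that initiator, per array
    "10:00:00:00:c9:2e:ab:01": {"array-A": ["LUN 12", "LUN 13"]},
}

def trace_wwn(wwn):
    """Given only an HBA's WWN, return its SAN path and assigned storage."""
    return {
        "path": FABRIC_ZONES.get(wwn, []),
        "storage": ARRAY_MASKING.get(wwn, {}),
    }

result = trace_wwn("10:00:00:00:c9:2e:ab:01")
```

The point of the model is that the WWN is the single key that joins fabric topology to array allocation, which is why small agents can get away with collecting so little from the host.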
For instance, products such as CreekPath Systems' Storage Operations Manager, EMC's Control Center and Veritas' SANPoint Control collect this information by putting agents on every server attached to the storage network. These agents perform active monitoring on the servers where they reside, querying the server and SAN for updated information based on parameters the user sets. Because the agents remain active on the server, users need to decide how current they want their information to be--that is, how often the agents should run. They also need to consider the time it will take to configure these agents and whether they can realistically use all of the information the products provide. Look for products whose agents can be deployed and configured in less than two minutes; anything longer is rarely worth the effort.
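The freshness-versus-overhead trade-off behind scheduled agents can be sketched with a simple timer loop (Python standard library; `collect` stands in for whatever data gathering the agent performs--a shorter interval means fresher data but more load on the host):

```python
import sched
import time

def run_agent(collect, interval_s, cycles):
    """Run a collection callback every `interval_s` seconds for a
    fixed number of cycles, as a scheduled SRM agent might."""
    scheduler = sched.scheduler(time.time, time.sleep)

    def tick(remaining):
        collect()  # gather server/SAN statistics here
        if remaining > 1:
            scheduler.enter(interval_s, 1, tick, (remaining - 1,))

    scheduler.enter(0, 1, tick, (cycles,))
    scheduler.run()

samples = []
run_agent(lambda: samples.append(time.time()), interval_s=0.01, cycles=3)
```

Real agents would run indefinitely on a cron-like schedule; the fixed cycle count here just keeps the sketch self-terminating.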
Tape utilization and library monitoring remain below most users' radar screens. Products like Storability's GSM do give visibility into StorageTek tape environments, and most tape vendors provide native utilities to measure and report on library capacity. Yet with the increasing use of ATA disk for backup, the relatively low cost of tape, and the scarcity and proprietary nature of tape management tools, users should concentrate on getting disk under control before worrying about tape.
Measuring the assets
The time it takes to measure storage assets will depend on the tool selected, the size of the environment, what's being measured and the number of agents deployed. All of these factors will contribute to the quantity, quality and level of detail gathered on the environment. Users should temper their reporting expectations in accordance with how long the tool has been implemented. Depending on the components deployed, expect initial reports to provide statistics on storage utilization, the sizes and types of files on individual servers, available ports on FC switches and allocated and unallocated storage on each storage array.
These usage reports can have an immediate impact on the bottom line. They help organizations determine whether they need to purchase more storage arrays or switch ports, whether they have enough capacity available to meet current and forecasted needs, and whether they're buying the right kind of storage. For instance, the reports may confirm that the organization needs to purchase more storage, but also reveal that it doesn't need an expensive monolithic array--a more affordable modular array will do because the data is primarily reference data.
After the tool is up and running for a few months, expect more comprehensive enterprise reports. Storage utilization may now be grouped by such criteria as application, operating system, location or department.
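Grouping utilization by such criteria is, at heart, a roll-up over per-server records. A minimal sketch with invented server names and numbers:

```python
from collections import defaultdict

# Hypothetical per-server utilization records, as an SRM tool
# might collect them; all names and figures are invented.
RECORDS = [
    {"server": "ora01", "department": "finance", "used_gb": 420},
    {"server": "ora02", "department": "finance", "used_gb": 310},
    {"server": "web01", "department": "marketing", "used_gb": 85},
]

def utilization_by(records, key):
    """Roll per-server usage up by any criterion
    (department, application, operating system, location...)."""
    totals = defaultdict(int)
    for rec in records:
        totals[rec[key]] += rec["used_gb"]
    return dict(totals)

report = utilization_by(RECORDS, "department")
# finance: 730GB, marketing: 85GB
```

The same function rolls up by any field present in the records, which is why enterprise reporting hinges on tagging collected data with these attributes in the first place.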
Making information life cycle management (ILM) a reality requires intervention. Actively managing your storage to achieve an effective ILM program should be an important consideration in choosing a capacity management tool. Right now, only the best-managed environments can report on and measure storage capacity, and even they remain hard-pressed to redeploy storage assets quickly. Next-generation technologies--newer storage arrays, iSCSI SANs and network-based file- and block-level virtualization, along with host-based replication software--will help organizations realize the benefits of a centrally managed storage utility.