Published: 12 May 2006
ILM isn't just tiered storage
To realize the true benefits of tiered storage--and to take a big step toward ILM--you have to align data with its business value.
It's now three-plus years into the information lifecycle management (ILM) "movement" and time to take stock of its evolution. Back in its early days, every major storage vendor's Web site boldly trumpeted ILM, complete with photos of smiling people ostensibly enjoying its benefits. ILM can still be found on those Web sites today, but it's promoted far less prominently.
A fair observation is that the initial enthusiasm over ILM has been scaled back to more pragmatic levels. As a result, the focus of discussion has shifted to the realm of tiered storage, a far more modest and realistic goal. Tiered storage could be thought of as ILM lite or, for the perpetually optimistic, the first step toward "real" ILM. You may recall that ILM demands storage alignment with data value and regular re-alignment based on the changing value of data as it ages--a challenging proposition. The tiered storage approach, on the other hand, requires only an initial alignment, possibly supplemented with data archiving.
It's not an unreasonable approach. We've encountered situations where data was allocated to top-tier storage under a one-size-fits-all policy, without regard to its actual value; as data volumes grew, that led to serious inefficiencies. With 90% of data on the most expensive platforms, the tiered-storage concept is an easy sell. The pressure to cut or contain spending is considerable, and buying storage at half the price of what you're currently paying is a no-brainer.
Apparently, many of you agree, and storage vendors have been selling ever-increasing quantities of second- and third-tier storage. But in many cases, the expected cost savings haven't materialized. Why? And what can be done to ensure that the value of these investments is realized?
At the risk of stating the obvious, the key to realizing the benefits of tiered storage is being able to move and maintain data within the tiers. This means aligning data and storage according to business value. It requires doing some things outside the comfort zone for many storage managers, such as:
- Building a business case that demonstrates true cost savings without introducing risk to the organization.
- Developing an easy-to-implement and, more importantly, easy-to-maintain methodology for classifying data and applications.
If it ain't broke ...
Users may resist a tiered storage infrastructure. Storage admins may be enthusiastic about the new tiers of storage and the possible infrastructure improvements, but to application owners, new storage means change and change means risk. If an app has been running on tier-one storage for years, why would a user want to move it? The good corporate citizenship argument can go only so far, and the outages required for data migration, along with the ensuing risks of disruption and possible performance degradation, are deal killers. Therefore, their attitude is likely to be "If it ain't broke, don't break it!"
Another consideration is the increased complexity introduced by having to manage a new set of devices. Unfortunately, management tools that span vendors and platforms are still not available, so each technology must be managed, to some degree, with its own set of tools. Understaffed storage teams may be taxed by the demands of developing sufficient skill levels to effectively leverage the new technology.
The operational cost impact of additional tiers isn't often properly factored in. Of the various components making up total cost of ownership, ongoing management and operations can dwarf the capital component. So, hardware savings might be erased by increased operational expenses. At the very least, the hardware cost delta between tiers must be substantial enough to make it worth the additional complexity. If not, you're spinning your wheels.
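That break-even logic can be sketched with a quick calculation. The figures below are purely illustrative assumptions, not numbers from this article; the point is only that the hardware delta must exceed the added operational cost before a new tier pays off.

```python
# Hypothetical sketch: does the hardware savings of a cheaper tier
# survive the added operational cost of managing it?
# All dollar figures are illustrative assumptions.

def annual_savings(tb, tier1_cost_per_tb, tier2_cost_per_tb, added_ops_cost):
    """Net annual savings of moving `tb` terabytes from tier 1 to tier 2."""
    hardware_delta = tb * (tier1_cost_per_tb - tier2_cost_per_tb)
    return hardware_delta - added_ops_cost

# 50 TB moved to a tier costing half as much per TB, against $250,000/year
# of extra tools, training and management overhead for the new platform.
net = annual_savings(tb=50,
                     tier1_cost_per_tb=30_000,
                     tier2_cost_per_tb=15_000,
                     added_ops_cost=250_000)
print(net)  # 500000 -- the delta covers the added complexity
```

Run the same numbers with only 10 TB moved and the result goes negative: the new tier would cost more than it saves, which is exactly the wheel-spinning scenario described above.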
A successful business case for tiered storage needs to consider user acceptance, operational impact and significant, achievable cost savings. Too often, the tiered storage decision is predicated on a Field of Dreams mentality--if you build it, they will come. The truth is that revenue-producing business units often trump IT, a functional cost center. While some IT organizations may be able to dictate standards, in many cases they're the ones who are dictated to and therefore need to provide a more compelling business justification.
Of classes and tiers
To most storage administrators, the concept of tiered storage centers on establishing multiple hardware platforms. For example, tier one may be an enterprise array, tier two a midrange system with Fibre Channel drives, and tier three the same midrange array with SATA drives. Essentially, the approach is hardware-first: define the platforms, then map applications onto them.
This focus on hardware is a mistake. Users don't care what storage platform an application is running on, but they do care about the level of service they're receiving. I realize this isn't always the case in the real world, and that users often dictate their preferred platforms. But this is largely because they've been conditioned to think and speak "hardware" to their infrastructure teams. This restricts IT and obstructs the development of a business-aligned, cost-effective storage infrastructure. It's how we got into the "90% of data on tier one" mess in the first place.
Instead, the discussion should be based on an application's required data management attributes. The goal is to establish classes of service rather than tiers of storage--a subtle, but critical, distinction. It shifts the conversation away from hardware to business needs and alignment. As shown in the next chart, the focus is on defining business requirements that can then lead to an application or data classification framework. This, in turn, leads to the definition of service levels and the creation of a service catalog. Then the technology to support each service level is selected. These services may--or may not--require multiple tiers of storage. In many cases, service-level distinctions will be based on applying different features of a single hardware platform, for example, RAID 5 vs. RAID 10, frequency of split mirrors, or synchronous vs. asynchronous vs. no replication.
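The distinction becomes concrete when a class of service is expressed as a set of data management attributes rather than a box. A minimal sketch follows; the class names and attribute values are invented for illustration, not a catalog from this article.

```python
# Hypothetical sketch of a service catalog: classes of service defined by
# data management attributes, not by hardware tiers. Names and values are
# illustrative assumptions only.
from dataclasses import dataclass

@dataclass(frozen=True)
class ServiceClass:
    name: str
    raid_level: str         # e.g., RAID 10 vs. RAID 5
    replication: str        # "synchronous", "asynchronous" or "none"
    split_mirror_freq: str  # how often point-in-time copies are taken

CATALOG = {
    "gold":   ServiceClass("gold",   "RAID 10", "synchronous",  "hourly"),
    "silver": ServiceClass("silver", "RAID 5",  "asynchronous", "daily"),
    "bronze": ServiceClass("bronze", "RAID 5",  "none",         "weekly"),
}

# An application is assigned a class by its business requirements; which
# physical array delivers that class is an implementation detail.
print(CATALOG["silver"].replication)  # asynchronous
```

Note that all three classes here could be delivered from a single hardware platform, with only features such as RAID level and replication mode varying, which is precisely the point.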
Building the classification framework is the key to moving toward an aligned infrastructure. Classification can be done at the application or data level, although it's most practical to start at the application level. There are several different dimensions of classification that must be considered. Each dimension becomes a classification category, as shown in "A data classification framework".
Some environments are using a phased approach to application classification and opportunistically leveraging other major initiatives--for example, focusing on data retention as part of an overall archiving initiative, or targeting performance and resilience in conjunction with equipment lease expirations and a technical refresh cycle. These efforts include developing the overall classification framework and approach, as well as the process and tools needed to support and scale the effort on an ongoing basis. One GlassHouse client initially focusing on data retention, for example, is creating an extensible methodology that includes the following:
- A cross-functional process flow, including phases, activities and dependencies involving infrastructure, app teams and functional areas such as legal, finance and compliance.
- Classification rule sets to establish the logical link between business requirements and defined service levels.
- Requirements questionnaires to gather application-specific business requirements.
- A comprehensive data classification repository built upon a relational database to store and manage business requirements, application information, infrastructure data and service-level information.
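The rule-set idea in the list above amounts to a mapping from questionnaire answers to a defined service level. A minimal sketch of such a rule set might look like the following; the thresholds, field names and level names are illustrative assumptions, not rules from the client engagement described.

```python
# Hypothetical sketch of a classification rule set: mapping answers from a
# requirements questionnaire to a service level. Thresholds and names are
# illustrative assumptions only.

def classify(requirements: dict) -> str:
    """Return a service level from gathered business requirements."""
    retention_years = requirements.get("retention_years", 0)
    rpo_minutes = requirements.get("rpo_minutes", 24 * 60)  # tolerable data loss
    revenue_impacting = requirements.get("revenue_impacting", False)

    if revenue_impacting and rpo_minutes == 0:
        return "gold"    # zero data loss: synchronous replication
    if rpo_minutes <= 60 or retention_years >= 7:
        return "silver"  # asynchronous replication, managed retention
    return "bronze"      # basic protection

answers = {"retention_years": 7, "rpo_minutes": 240, "revenue_impacting": False}
print(classify(answers))  # silver
```

Codifying the rules this way is what makes the methodology extensible: when legal or compliance changes a retention requirement, the rule changes in one place and every application re-classifies consistently.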
Tiered storage initiatives have largely been acts of faith based on the belief that buying cheaper storage would help stem rising costs. Achieving this goal isn't so easy. The only path to success is to clearly provide value. To do this, storage services must be aligned with business needs. The unavoidable point of intersection is application data classification.