You might think you have good insight into your infrastructure, but for next-generation data centers, it probably isn't good enough.
If you’re responsible for business-wide IT infrastructure strategy, you’re facing a unique array of innovative, appealing and business-changing technology choices. I can’t recall ever seeing so many big-picture options at once: virtualization, automation, the cloud and more. Moreover, nearly everything is better integrated and easier to use. But one element, visibility, remains an unusual sticking point.
Visibility is the ability to peer into what’s going on in the infrastructure so you can monitor, troubleshoot and prevent problems. Its absence has plagued enterprise IT computing for years, and it may be the single dimension that separates IT from other engineering disciplines, where more rigor can be, and usually is, routinely exercised.
But we don’t just need to improve visibility. We need to go beyond the kind of visibility that simply returns information; we need correlated, meaningful information and an automated way of acting on it. In other engineering fields this is called “instrumentation”: a system that integrates data across multiple, complex subsystems.
Instrumentation is critical to the next generation of data center technology. Virtualization and cloud initiatives have put IT infrastructure center stage and raised expectations for dependable, reusable and ubiquitous services to a higher level. An infrastructure with significantly more complexity needs to operate with greater precision and reliability.
The type of instrumentation that’s needed is distinctly new, but it’s still an under-recognized technology, often lumped in with tools that really aren’t in the same class, usually capacity planning or general trending tools.
A few key characteristics can help you sort through the available alternatives and identify a real instrumentation system that reaches beyond mere point products.
Infrastructure encompassing. Instrumentation should encompass a big slice of the infrastructure, be able to relate information across many domains and provide accurate insight into a complex data center. Detailed information is the foundation for analysis deep enough to inform automated interaction with the infrastructure. Information in the infrastructure can be thought of as a spectrum, from the signal levels on physical wires up to the most complex application and storage interactions. No matter how trivial a single domain of information may seem, missing bands of that spectrum (a physical layer, Fibre Channel frames, port transactions and so on) can compromise an instrumentation system.
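To make the "spectrum" idea concrete, here is a minimal sketch in Python. The layer names, field names and the gap check are my own illustrative assumptions, not any vendor's data model; the point is simply that if telemetry is tagged by layer, missing bands of the spectrum become detectable.

```python
from dataclasses import dataclass

# Hypothetical layer tags spanning the spectrum from physical wire to application.
LAYERS = ["physical", "fibre_channel", "port", "network", "storage", "application"]

@dataclass
class TelemetryEvent:
    layer: str       # one of LAYERS
    source: str      # device or host that produced the reading
    metric: str      # e.g. "crc_errors", "latency_ms"
    value: float
    timestamp: float

def coverage_gaps(events):
    """Return the layers of the spectrum with no telemetry at all --
    the missing bands that can compromise an instrumentation system."""
    seen = {e.layer for e in events}
    return [layer for layer in LAYERS if layer not in seen]
```

A system that reports its own blind spots this way at least tells you where your picture of the data center is incomplete.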
Efficient and scalable. Given the demands for more insight across more data and with a larger number of systems than ever before, it’s obvious that any instrumentation needs to be highly efficient. Instrumentation should be able to operate at cloud scale, but have minimal or zero impact on the infrastructure.
Real-time capable. Given the dynamic nature of a cloud infrastructure, instrumentation systems must be real-time. Anything less won’t deliver the trending analysis, deeply informed real-time alerting and ongoing observation that will provide a continuous view of the infrastructure.
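The difference between after-the-fact trending and real-time evaluation can be sketched in a few lines. This is an illustrative toy, with the window size and threshold chosen arbitrarily: each sample is evaluated the moment it arrives, rather than in a later batch report.

```python
from collections import deque

class RealTimeAlerter:
    """Evaluate every sample as it arrives; signal an alert when the
    rolling average of the last `window` samples exceeds `threshold`."""

    def __init__(self, window=5, threshold=20.0):
        self.samples = deque(maxlen=window)  # drops oldest sample automatically
        self.threshold = threshold

    def ingest(self, value):
        self.samples.append(value)
        avg = sum(self.samples) / len(self.samples)
        return avg > self.threshold  # True means "fire an alert now"
```

A batch tool would compute the same average hours later; a real-time system returns the verdict on the very call that delivered the offending sample.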
Cross-domain correlated. Detailed, efficient capture comes with hazards; it can easily create a deluge of information that will thwart efforts to streamline the infrastructure. Consequently, the instrumented private cloud must transform information into something meaningful through deep analytics and cross-domain correlation.
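As a rough sketch of what cross-domain correlation means in practice, the toy function below clusters raw events by a simple time-window heuristic and keeps only clusters that span more than one domain. Real instrumentation products use far richer analytics; the event tuples and the one-second window here are purely illustrative assumptions.

```python
def correlate(events, window_s=1.0):
    """Group raw events into incidents: events from *different* domains
    occurring within `window_s` seconds of each other are assumed related.
    Input: list of (timestamp, domain, description) tuples."""
    incidents = []
    for ts, domain, desc in sorted(events):
        if incidents and ts - incidents[-1][-1][0] <= window_s:
            incidents[-1].append((ts, domain, desc))  # extend current cluster
        else:
            incidents.append([(ts, domain, desc)])   # start a new cluster
    # Only multi-domain clusters count as meaningful correlations;
    # single-domain noise is filtered out rather than surfaced.
    return [i for i in incidents if len({d for _, d, _ in i}) > 1]
```

The filtering step is the point: instead of handing the operator every event, the system surfaces only the few clusters where domains implicate one another.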
Physically instrumented. Finally, the best way to harvest total information efficiently is with physical layer-attached systems. This may seem counterintuitive, since cloud environments are expected to move away from the restrictions of physical infrastructure. But clouds are still built on physical systems, and those systems ultimately provide the largest share of operational data access. Moreover, instrumenting the private cloud with physical layer solutions is possible today.
It’s increasingly clear that data center instrumentation is one of the most important factors in the success or failure of advanced data center infrastructures. While there’s a lot of wishful thinking that the next-generation infrastructure will do miraculous things right out of the box, the hard truth is that IT may never provide an “out-of-the-box” solution. IT must meet a diverse range of fairly unique business needs and, as a consequence, this will always entail engineering and integration.
The market is starting to respond to these needs. BlueStripe, NetApp (with its BalancePoint product, formerly from Akorri), Quest Software, Veeam and Virtual Instruments all offer products that are on the instrumentation track, and there are others out there, too. Check them out and see how they stack up against the criteria we’ve identified. Whatever you decide, it’s important you take that first step. The alternative could be a catastrophic failure.
BIO: Jeff Boles is a senior analyst at Taneja Group.