This article can also be found in the Premium Editorial Download "Storage magazine: Solid-state storage guide."
You might think you have good insight into your infrastructure, but for next-generation data centers, it probably isn’t good enough.
If you’re responsible for business-wide IT infrastructure strategy, you’re facing a unique array of innovative, appealing and business-changing technology choices. I can’t recall ever seeing so many big-picture options on offer at once: virtualization, automation, the cloud and more. Moreover, nearly everything is better integrated and easier to use. But one element, visibility, remains an unusual sticking point.
Visibility is the ability to peer into what’s going on in the infrastructure so you can monitor, troubleshoot and prevent problems. Poor visibility has plagued enterprise IT for years, and it may be the single dimension that separates IT from other engineering disciplines, where more rigor can be, and usually is, routinely applied.
But we don’t just need to improve visibility. We need to go beyond the kind of visibility that simply returns information; we need correlated, meaningful information and an automated way of acting on it. In other engineering fields this is called “instrumentation”: a system that integrates data across multiple, complex subsystems.
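To make the distinction concrete, here is a minimal sketch of that instrumentation idea: gather raw metrics from separate subsystems, correlate them into one meaningful finding, and trigger an automated response. Every subsystem name, metric and threshold below is hypothetical, chosen only for illustration.

```python
# Sketch: visibility vs. instrumentation. Raw visibility would just
# report two numbers; instrumentation correlates them and acts.
# All names and thresholds here are hypothetical examples.

from dataclasses import dataclass


@dataclass
class Metric:
    subsystem: str   # e.g. "storage", "network"
    name: str        # e.g. "latency_ms"
    value: float


def correlate(metrics):
    """Turn raw per-subsystem metrics into one correlated finding."""
    by_key = {(m.subsystem, m.name): m.value for m in metrics}
    storage_latency = by_key.get(("storage", "latency_ms"), 0.0)
    network_drops = by_key.get(("network", "packet_drops"), 0.0)
    # Correlation: high storage latency *and* network drops together
    # suggest a shared-fabric problem, not two unrelated incidents.
    if storage_latency > 20 and network_drops > 100:
        return "degraded-fabric"
    if storage_latency > 20:
        return "slow-storage"
    return "healthy"


def act(finding):
    """Automated response keyed off the correlated finding."""
    actions = {
        "degraded-fabric": "reroute traffic and open a ticket",
        "slow-storage": "rebalance hot volumes",
        "healthy": "no action",
    }
    return actions[finding]


sample = [
    Metric("storage", "latency_ms", 35.0),
    Metric("network", "packet_drops", 250.0),
]
print(act(correlate(sample)))  # -> reroute traffic and open a ticket
```

The point of the sketch is the middle step: without `correlate`, the monitoring layer only returns information; with it, two isolated readings become one actionable event that an automated policy can respond to.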
This was first published in March 2012