When puzzling out the issues of storage, it is wise to do a reality check on what the industry is offering in the way of 'solutions' and how their 'products' map to your requirements. This is often more challenging than you would think: first, because the industry isn't necessarily forthcoming with details -- particularly performance and cost-of-ownership data -- to help inform your product choices; and second, because consumers often have no idea what their applications truly require from infrastructure, mainly because they have never checked.
I've picked on vendors a lot lately, so let me briefly pick on you, the user. Understand that, ultimately, whatever storage products you deploy, your choices reflect on your professional judgment, and you must own the blame when things don't work as planned. So please take my comments in the spirit in which they are offered.
To set the stage, let me tell you a true story. I recently participated in a conference call with a large manufacturer whose CIO wanted me to chat with his infrastructure management team for reasons that became evident to me later. They were preparing to roll out a Fibre Channel (FC) fabric in each of their 20 production facilities.
I asked them whether they had assessed their application requirements before settling on an FC SAN as the storage platform. They had not. They figured (and their vendor encouraged this thinking) that it would be better to build the infrastructure, then fit the applications into it.
"An interesting approach," I said aloud, but 'bass-ackwards,' I thought to myself.
I flashed back on the first day of my first class in information technology many years ago (back when it was still called "data processing"). The instructor said, "First, you must understand what your application software requires, then you must provide for it in your technology architecture." Those remain words that I live by.
So, just to be nosy, I asked, "What are your applications?"
They told me that in each factory, the app was the same: a Microsoft SQL Server database.
"How much data are they generating?" I asked.
"About 360 gigabytes (GB)," they responded.
"Is that per day, per week?" I asked.
"No," they responded. "That's 360 GB total."
I thought about saying something like, "Hey guys, did you know that you can store 360 GB on a single disk drive these days?" Instead, I asked, "Is this a frequently updated data set? I mean, are you modifying it on the order of hundreds or thousands of transactions per second?"
"No," they said. "It is fairly static data."
"Oh, and we delete it every 24-48 hours in its entirety," one fellow offered cheerfully.
"So, you are deploying 20 FC fabrics to support 360 GB of data in each site that you delete every day or two," I repeated to make certain that I was hearing correctly.
I was. I thought about saying something smart-alecky, but I held my tongue. "So, do you have a really good, highly trained cadre of FC-savvy tech support personnel in each factory to manage your FC fabric?"
They hemmed and hawed, like the crew from Monty Python in one of their sketches on the BBC or PBS. The final answer was "No."
"I see," I said, my teeth grinding. "So, do you have a storage management utility or something that gives you visibility into all the fabrics from, say, corporate headquarters? Tools that will let you be proactive from afar about burgeoning equipment failures and the like?"
More Monty Python-esque chatter, then, "Not really. No."
Seems that their vendor had told them that the fabrics were a good idea. Strategic, you know? "And, nobody gets fired for buying EMC," I said, completing their reasoning in my head.
The moral of this story is obvious, so I won't belabor it. But, in part two of this post, I will offer a few words of advice for preventing 'bass-ackwards' storage in your environment. Please stay tuned.
About the author: Jon William Toigo is a Managing Partner for Toigo Productions. Jon has over 20 years of experience in IT and storage.