This article can also be found in the Premium Editorial Download "Storage magazine: Tips for lowering the cost of storage support contracts."
We're hung up on an outdated computing model that makes everything tougher.
We're at one of those rare, powerful inflection points in our crazy storage universe: one so big it can't be ignored, yet one that arrived so slowly that nobody seemed to notice it until now.
For 50 years, dynamic, transactional data was what moved computing out of science labs and into the commercial world. Trillions of dollars have been spent trying to harness the power of that data and to provide the infrastructure to create, store, protect, move and manipulate it. The problem is that most data isn't transactional anymore. A huge portion of the issues in storage, and throughout our infrastructure, stem from trying to apply the same systems, architectures and methodologies we're used to against data that has entirely new requirements.
Commercial computing was built on applying technology to business causes: creating competitive advantage by enabling users to work more efficiently. That led to greater profitability and faster decision-making based on richer data analysis. The competitive advantage eroded once everyone had computers, and computing went from being strategic to being tactical and defensive. The game became how many transactions you could support concurrently; the value of each transaction minus the cost of performing it equaled the "hard dollar" profit.
The industry built for transactions. Commercial computing was dominated by the folks who built the right stuff at the right time and sold it most effectively. IBM mainframes are still the transactional systems of choice.
Oracle is still the dominant DBMS. EMC is still the core storage device. Hewlett-Packard and Sun Microsystems cashed in on the rush to distributed computing. Smaller servers meant smaller decentralized storage, opening the door to the likes of Network Appliance, which created and continues to dominate the file-serving world created by the distributed computing evolution. Everyone and their brother came out with a block array for the "open" world.
What that means is that we now find ourselves with massive amounts of stuff. The number of devices we have to manage, wired into complex networks, rivals the complexity of the human nervous system. And the problem is compounded by our applying the same basic operational and technological constructs to this world as we did to the monolithic systems of old.
The industry rode the new market wave by dumbing down and cheapening the heavy-duty stuff you'd already bought. It took systems built for transactional business, ripped things out to make them cheaper and then sold them as the answer for the "new world."
We now have 100 times the specialization we had in the centralized world because we have 1,000 times the number of things to know. We do this because we haven't noticed that we're solving the wrong problems the wrong way ... and if we don't stop, it's going to kill us. Next month, I'll explain the right problem, which will get you looking for the right answers.
This was first published in May 2007