I have a highly mixed environment with Sun Solaris, Windows NT, Windows 2000, AS/400 and other systems attached to a Cisco SAN and EMC disk arrays. I am trying to set up a set of maximum performance thresholds for the whole environment, but information on this seems scarce. Can you recommend any websites or other sources that cover this? I particularly need information on the performance of EMC disk arrays and storage area networks.
I certainly agree with you that information on performance numbers is rare. But I believe the reason tends to be the definition of the term performance itself. Is performance the speed at which data is transferred? Is it the reliability of a system? Is it how good a system is at delivering its "deliverables" and, if so, what are the deliverables?
If performance is taken to mean the speed at which data is transferred to its destination, what benefit do we assume is attached to that speed? What conclusions do we draw from a speed threshold? That high speed is good and low speed is bad? I don't want to become too philosophical about this, but we also have to ask whether we are measuring the right things and asking the right questions.
If we look at what parameters influence the time it takes for a piece of data to be transferred to the physical storage, we basically find the following devices along the data path within a SAN: the host and its bus adapter (HBA), the SAN switches, the array's front-end ports and cache, and the back-end controllers and physical disks.
This is simply to say that there are many devices and possible bottlenecks to take into account. If you measure the throughput of the physical storage, you can come up with huge differences depending on the data involved. The throughput for small files is certainly much lower than for big ones, simply because the read/write heads have to be repositioned more often when small files are written. And writing data always takes a bit more time than reading it. So even if you find performance data or benchmarks, you should be very careful in interpreting them. A benchmark can give you an idea of the level of performance a particular device operates at, but you must also understand the test environment, since it strongly influences the outcome. For this reason, you can seldom compare two benchmarks easily.
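The small-file effect described above can be sketched with a simple model: if each file costs one head repositioning (seek) plus its transfer time, effective throughput collapses for small files. This is a minimal illustration, not a model of any real array; the seek time and transfer rate used here are assumed, illustrative numbers.

```python
# Hedged sketch: why small files yield much lower effective throughput
# than large ones. The seek time and sustained transfer rate below are
# illustrative assumptions, not measurements of any real device.

def effective_throughput_mb_s(file_size_mb, seek_ms=8.0, transfer_mb_s=100.0):
    """Effective throughput when each file costs one seek plus its transfer time."""
    seek_s = seek_ms / 1000.0               # head repositioning cost per file
    transfer_s = file_size_mb / transfer_mb_s  # raw transfer time
    return file_size_mb / (seek_s + transfer_s)

# Compare a 4 KB file, a 1 MB file and a 64 MB file:
for size_mb in (0.004, 1.0, 64.0):
    print(f"{size_mb * 1024:8.0f} KB -> {effective_throughput_mb_s(size_mb):6.1f} MB/s")
```

With these assumed numbers, the 4 KB file achieves well under 1 MB/s while the 64 MB file approaches the full transfer rate, even though the underlying hardware is identical in both cases.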
There is a rule of thumb that says a 10% difference in benchmarked performance is no difference at all: unless the performance of two devices differs by at least 10%, they should be treated as equal performers. Needless to say, this is not a favorite of the device manufacturers, but as a practical hint it is very useful.
Editor's note: Do you agree with this expert's response? If you have more to share, post it in one of our discussion forums.
This was first published in October 2003