Note from the editor: A software architecture designer in the travel services industry recently wrote to our storage management expert, Brett Cooper, for help in solving performance bottlenecks in a CPU-bound application. The problem, he explained, was that the application tended to bog down under two conditions: 1) immediately after a database update, or 2) whenever his team implemented a new version of the application. Might a solid-state drive provide faster access and allow more concurrent requests for data? Would switching to Fibre Channel help?
The challenge in answering your question lies in understanding your application's performance footprint and mapping it onto a specific transport and protocol. Performance is usually only one aspect of making an intelligent storage decision. Have you ever measured the performance of your application environment (storage, host, etc.)? You may find that the storage subsystem is not the limiting factor; rather, the application is saturating the host's processors or memory, and a system upgrade is required to deliver the kind of throughput and performance the application needs. Other criteria include cost and manageability.
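One quick way to check whether the host, rather than the storage, is the limiting factor is to compare a workload's CPU time against its wall-clock time: if the two are close, the processors are doing the work; if CPU time is a small fraction of elapsed time, the process is mostly waiting, often on I/O. A minimal Python sketch of that idea (the function names and the 0.7 threshold are illustrative assumptions, not a standard tool):

```python
import time

def classify_workload(fn, *args):
    """Run fn and compare its CPU time to its wall-clock time.

    A ratio near 1.0 suggests the work is CPU-bound (faster storage
    won't help); a low ratio means the process spent most of its time
    waiting, e.g. on disk or network I/O.
    """
    wall_start = time.perf_counter()
    cpu_start = time.process_time()
    fn(*args)
    wall = time.perf_counter() - wall_start
    cpu = time.process_time() - cpu_start
    ratio = cpu / wall if wall > 0 else 0.0
    return ("cpu-bound" if ratio > 0.7 else "io/wait-bound"), ratio

def busy_work(n):
    """Burns CPU in a tight loop."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def waiting_work(seconds):
    """Simulates a process blocked on I/O."""
    time.sleep(seconds)

print(classify_workload(busy_work, 2_000_000))
print(classify_workload(waiting_work, 0.2))
```

In a real environment you would reach for the platform's own tools (vmstat, iostat, perfmon) instead, but the same ratio is what tells you whether to upgrade the host or the storage.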
Cost: Performance can come at a high price. Remember that you are trying to attain a performance footprint, and getting there may require you to spend additional dollars on hardware up front. Also understand that the hardware itself is just one upfront (depreciated) cost; there is also upkeep and management, which, in the end, can cost many times what the physical storage does. Include the cost of the solution's full lifecycle as well, covering backup and recovery, mirroring and replication, and high-availability capabilities, all of which are important to understanding the total cost of ownership.
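To see how quickly the recurring costs can dwarf the purchase price, here is a small sketch. All of the dollar figures are purely hypothetical, chosen only to illustrate the shape of the calculation:

```python
def total_cost_of_ownership(hardware, annual_mgmt, annual_backup, years):
    """Hardware is a one-time (depreciated) cost; management and
    backup/recovery recur every year of the solution's life."""
    return hardware + years * (annual_mgmt + annual_backup)

# Hypothetical figures for illustration only: a $50,000 array,
# managed and protected for five years.
tco = total_cost_of_ownership(hardware=50_000, annual_mgmt=30_000,
                              annual_backup=10_000, years=5)
print(tco)  # prints 250000: recurring costs ($200,000) are 4x the hardware
```

Even with modest assumed annual costs, the operational side of the ledger dominates, which is why the full lifecycle belongs in any cost comparison between platforms.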
Manageability: If your company is comfortable managing network storage, what would moving to a Fibre Channel setup involve? Retraining the group? Purchasing new hardware and software management tools? Possibly outsourcing the management to a third party? Which policies and procedures would need to be updated to accommodate a new storage platform, and what changes would be required on the hosts? I suggest meeting with the end users and understanding their requirements as well, rather than looking only at the single criterion of performance; those requirements determine how the solution integrates into the current environment.
Some more database background: Most databases write in 8 KB blocks, and, depending on the write and read patterns (random or sequential), NFS, iSCSI and Fibre Channel could each meet the need quite well. The challenge in selecting the best-performing protocol isn't the physical speed limit of the wire; it is understanding the requirements of the application and then creating a requirements document that the team can review and share with the storage vendors so each can respond with its best solution. Once you receive the proposals, go through them, pick the top two or three, and invite those vendors into your shop to prototype the environment and prove their solutions against your challenges. Also, make sure each vendor can produce real customer references for you to talk with about the solution. The best solution is one that is already in production, so you can learn from those customers and get advice on what works, what doesn't, and what is coming down the road for the total solution.
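Before writing that requirements document, it is worth characterizing your own random-versus-sequential behavior at the database's block size. A minimal sketch of the idea, assuming an 8 KB block and a small local test file (the file size, block count and helper names are illustrative; because of OS caching, a meaningful benchmark needs a file larger than RAM or direct I/O, so treat this only as the skeleton of the method):

```python
import os
import random
import tempfile
import time

BLOCK = 8 * 1024      # typical database page size
NBLOCKS = 2_000       # ~16 MB test file; far too small for real results

def make_test_file():
    """Create a temp file of NBLOCKS 8 KB blocks and return its path."""
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(os.urandom(BLOCK) * NBLOCKS)
    return path

def read_blocks(path, offsets):
    """Read one 8 KB block at each offset; return elapsed seconds."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
    return time.perf_counter() - start

path = make_test_file()
sequential = [i * BLOCK for i in range(NBLOCKS)]
shuffled = sequential[:]
random.shuffle(shuffled)

t_seq = read_blocks(path, sequential)
t_rand = read_blocks(path, shuffled)
print(f"sequential: {t_seq:.4f}s  random: {t_rand:.4f}s")
os.remove(path)
```

The gap between the two numbers (measured properly, on the real storage path) is exactly the kind of data point that belongs in the requirements document you hand to the vendors.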
The vendor that best demonstrates its ability to solve your challenge is the clear winner. Try to keep politics out of the decision. Focus on the costs and the capabilities of the solution that best suit your needs and the needs of your company. You may be betting your happiness and, in the end, your job on the solution.
For more information:
SAN School: What makes a SAN go
About the author: Brett Cooper is a Technical Marketing Engineer at Network Appliance, Inc., as well as SearchStorage.com's storage management/best practices expert.