Not according to Marc Staimer, president of Dragon Slayer Consulting. "There is no way in hell a database can saturate a SAN in any way, shape, or form," he declares.
Staimer notes that while databases can demand a large volume of data, individual I/O requests carry little latency, and databases are usually spread across multiple platters so that seek time is minimized. Furthermore, database administrators can deploy additional tricks of the trade to improve performance -- implementing hot files, for example. For those who think they have a database performance problem, he advises an investment in caching and lots of solid-state memory. Similarly, says Staimer, NAS setups have no problems with most databases. However, he adds, Oracle may be an important exception "because it has about 300 commands when it writes to block storage," which adds a lot of processing overhead.
Nancy Hurley, an analyst with Enterprise Storage Group, takes a similarly upbeat view of networked storage and databases. She says the only time a database would present a real performance challenge is if it has been allowed to grow out of control. In that scenario, increased search time could bog down the I/O. Those facing performance problems from that source could consider solutions such as those available from Princeton Softech, whose Active Archive solutions let companies manage and store data based on its business value, using an information lifecycle management approach.
Likewise, Hurley gives her endorsement to NAS for databases, though she cautions that very low-end NAS might not be robust enough. The bottom line, she says, is to determine how much latency your application can handle.
About the author: Alan Earls is a freelance writer in Franklin, Mass.
This was first published in June 2004