We are searching for a storage solution for our Web/application environment consisting of three servers (test, production, data warehouse) and for our DB environment (test DB server, production DB server, data warehouse DB server).
Following is additional information:
- OS is Solaris 8
- DB is Oracle 9i
- The production Web/application server, production DB server, data warehouse Web/app server and data warehouse DB server are high-traffic servers. The test and test DB servers are medium-traffic servers.
I'm sorry if I haven't provided you with enough relevant information. I'm rather new at this. Thanks very much for your help.
OK. The best storage-attachment method for the described environment would be NAS for the Web servers and SAN for the DB and application servers. If your business logic resides on the Web tier, you can use a pool of SAN-based storage behind a NAS head to share out data to the Web farm. If the Web servers build dynamic pages through the application servers to the DB servers, then you can use internal disks on the Web servers and connect the production and test application and DB servers to the SAN.
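To make the NAS-head approach concrete, here is a minimal sketch of how one of the Solaris Web servers might mount shared content exported by the NAS head over NFS. The hostname "nashead", the export path and the mount point are placeholders, not names from your environment.

```shell
# Hypothetical example (Solaris): mount a shared document root from a
# NAS head over NFS. "nashead" and all paths are placeholder names.
mount -F nfs -o rw,hard,intr nashead:/export/webcontent /www/docs

# To make the mount persistent across reboots, add a line like this
# to /etc/vfstab (fields: device to mount, device to fsck, mount point,
# FS type, fsck pass, mount at boot, mount options):
#
# nashead:/export/webcontent  -  /www/docs  nfs  -  yes  rw,hard,intr
```

Because every Web server mounts the same export, content published once on the NAS head is immediately visible to the whole farm.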
I assume the test servers will be used as swing boxes into production when maintenance needs to be performed on production servers. If not, buy yourself another set of servers for maintenance. The test and data warehouse servers can get almost immediate updates by using storage based cloning. You can clone the production data and auto mount it to the data warehouse server, and do the same for the test environment so testing can be done on current information.
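The clone-and-mount refresh described above might look roughly like the following on the data warehouse side. This is only a sketch: the tablespace name, device names and mount point are placeholders, and the clone itself is triggered by your array vendor's own CLI (shown here as a stand-in command), since tools such as EMC TimeFinder or Hitachi ShadowImage each have their own syntax.

```shell
# Hypothetical sketch: refresh the data warehouse from a storage-based
# clone of production. Device names, paths and the array CLI are placeholders.

# 1. On the production DB server, put the tablespace(s) in hot-backup
#    mode so the clone is recoverable (repeat per tablespace on 9i):
#    echo "alter tablespace users begin backup;" | sqlplus "/ as sysdba"

# 2. Trigger the array-side clone (vendor-specific placeholder command):
#    array_clone_cli --source PROD_LUN --target DW_LUN

# 3. On production, take the tablespace(s) back out of backup mode:
#    echo "alter tablespace users end backup;" | sqlplus "/ as sysdba"

# 4. On the data warehouse server, check and mount the cloned slice:
fsck -F ufs /dev/rdsk/c2t1d0s6
mount -F ufs /dev/dsk/c2t1d0s6 /u02/oradata
```

The same sequence, pointed at the test server's LUNs, gives the test environment a current copy of production data.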
If your application is mission critical, I would recommend a clustered back-end for the database servers using Oracle as the cluster engine. This would also eliminate the need for a maintenance server as the DB can be moved between cluster members.
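With Oracle 9i RAC on the back end, client failover between cluster members is typically handled with Transparent Application Failover in the `tnsnames.ora` entry. A sketch is below; the host names, port and service name are placeholders for your own configuration.

```
# Hypothetical tnsnames.ora entry for a two-node 9i RAC service
# with Transparent Application Failover (all names are placeholders):
PROD =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = dbnode1)(PORT = 1521))
      (ADDRESS = (PROTOCOL = TCP)(HOST = dbnode2)(PORT = 1521))
      (LOAD_BALANCE = yes)
      (FAILOVER = yes)
    )
    (CONNECT_DATA =
      (SERVICE_NAME = prod)
      (FAILOVER_MODE = (TYPE = SELECT)(METHOD = BASIC))
    )
  )
```

With an entry like this, clients reconnect to the surviving node automatically, which is what lets you take one cluster member down for maintenance without a separate swing server.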
If you are using NT for the Web front end, you can use the Windows NT Load Balancing Service (WLBS, called Network Load Balancing in Windows 2000) to scale that tier. It works much like a BigIP load balancer, except it is software-based and cheaper.
Backup should be done using a shared library connected to the SAN fabric layer where serverless backup would be supported. This will help keep the application up 24x7, and not impact production during backup hours.