New developments such as 2.5-inch drives and blade servers have tremendously increased the compactness of SANs and other storage products. This is good news in terms of saving floor space, but bad news for equipment life. Heat production rises along with storage density, and the new technologies mean even more component-killing heat.
This exacerbates a heat problem that has been building for a long time. Increasingly, storage hardware is moving out of climate-controlled glass houses and into smaller spaces. Often the drives and switches for a modern enterprise SAN live in a room that was not originally designed to house a mass of electronic equipment. Sometimes the "server room" is a former broom closet.
Even the traditional climate-controlled computer room can be inadequate for the new SANs and other systems. "Climate control" often means that the overall temperature and humidity are within specification, but the air actually being pulled into the equipment has been pre-heated by other equipment. Since staff at most computer centers have only a sketchy idea of their actual airflow patterns, the problem often goes undetected.
Cooling issues for SANs and other data processing equipment are gaining attention, however. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recently released new guidelines for the thermal environment of data centers. One of the most obvious changes is that the Society has combined specifications for equipment manufacturers and data centers in an integrated approach to keeping electronic equipment cool.
More significantly from the standpoint of storage administrators, the new guidelines specify inlet temperatures for equipment rather than just the overall temperature of the room. This means that making sure your equipment is within those guidelines requires measuring at the cooling inlet for each piece of equipment, not just hanging a thermometer on the wall.
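The shift from a wall thermometer to per-inlet measurement can be sketched in a few lines. The following is a hypothetical illustration, not part of the ASHRAE guidelines: the temperature range used here is illustrative only, and the equipment names are invented. Consult the actual guidelines for the recommended limits for your equipment class.

```python
# Illustrative sketch: check each equipment inlet against a recommended
# temperature range, rather than trusting one room thermometer.
# The range below is a placeholder -- use the limits from the current
# ASHRAE thermal guidelines for your equipment class.
RECOMMENDED_RANGE_C = (20.0, 25.0)

def out_of_spec(inlet_readings, low=RECOMMENDED_RANGE_C[0],
                high=RECOMMENDED_RANGE_C[1]):
    """Return only the inlets whose measured intake temperature
    falls outside the recommended range."""
    return {name: temp for name, temp in inlet_readings.items()
            if temp < low or temp > high}

# Hypothetical readings taken at each cooling inlet (degrees C):
readings = {
    "san-switch-1": 23.5,
    "disk-array-a": 27.8,   # pre-heated by a neighboring rack's exhaust
    "disk-array-b": 24.1,
}
print(out_of_spec(readings))  # flags only disk-array-a
```

Note that the room average of these readings would look acceptable even though one array is taking in dangerously warm air, which is exactly the situation a single wall thermometer hides.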
The payoff for the added complication is longer equipment life. Heat is the great enemy of electronic equipment. It doesn't fry components like a power surge; instead it slowly, inevitably sucks the life out of them like a thermal vampire.
In general, the effort made to meet cooling standards should be proportional to the cost and criticality of the equipment. The most basic tool is a thermometer to measure inlet air temperatures for equipment. The simplest technique for fixing a simple heating problem is to move or rearrange the components in the computer room. Often something as simple as changing the order of equipment in the rack or increasing the spacing between components will be enough to improve cooling significantly.
It is also vitally important not to run disk arrays or other equipment without their enclosures. Removing the enclosure can actually increase the heat load on the critical components because it disturbs the carefully engineered air flow over vital components. For the same reason it is important to keep openings plugged with plates when the spaces are not occupied by components.
At the other end of the scale, for large, expensive or critical installations, a number of sophisticated tools are available to make sure the equipment stays cool. A network of thermometers connected to a data collection system can monitor all the equipment inlets, for example. Modeling software such as Airpak from Fluent Software can determine temperature and airflow in the center in the design stages and conduct "what-if" studies without moving a lot of equipment repeatedly.
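A data collection pass over such a network of thermometers amounts to reading every inlet sensor and summarizing the results. This sketch assumes a hypothetical `read_sensor()` function standing in for whatever interface (SNMP, IPMI, a vendor API) your particular sensors expose; the summary logic is the portable part.

```python
import statistics

def summarize(samples):
    """Aggregate one polling pass over all inlet thermometers.

    samples: dict mapping inlet name -> measured temperature in C.
    Returns the hottest inlet (the one most likely to be taking in
    pre-heated air) plus a fleet-wide mean for trend logging.
    """
    hottest = max(samples, key=samples.get)
    return {
        "hottest_inlet": hottest,
        "hottest_temp_c": samples[hottest],
        "mean_temp_c": round(statistics.mean(samples.values()), 1),
    }

# Hypothetical single pass; in practice read_sensor() would query
# each thermometer over the network.
samples = {"san-switch-1": 23.0, "disk-array-a": 26.5, "tape-lib-1": 24.0}
summary = summarize(samples)
print(summary["hottest_inlet"], summary["hottest_temp_c"])
```

Logging the per-pass summary over time is what turns a box of thermometers into a monitoring system: a slowly climbing mean, or one inlet that is consistently the hottest, points to exactly the airflow problems described above.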
It is also important to establish policies for the data center that promote proper cooling. For example, establish "hot aisle" and "cold aisle" zones, where equipment draws air in from the cold aisle and discharges it into the hot aisle.
Finally, simple awareness plays a large part in keeping your SAN and other equipment cool. If the storage and other IT people are aware of the importance of cooling and how to keep electronics cool, they will automatically spot or head off problems before they become serious.
For more information:
SAN School: What makes a SAN stop
Tip: Fireproof your files
About the author: Rick Cook has been writing about mass storage since the days when the term meant an 80 K floppy disk. The computers he learned on used ferrite cores and magnetic drums. For the last 20 years he has been a freelance writer specializing in storage and other computer issues.