Professionals in the computer business are used to dealing with the many challenges associated with cooling servers and other computing equipment. And even they have a hard time.
Now consider the plight of average small business owners. No matter what line of business they're in, they've had technology thrust upon them. Servers, networking gear and storage hardware have become a fact of life for even the smallest firms. Business owners typically set up these machines haphazardly as needed, and that can lead to trouble.
“Excess heat can be a problem for small businesses because their servers now run greater loads and for longer periods. Many of them need to be available 24/7,” said Anil Miglani, an analyst with AMI-Partners Inc. “It is not uncommon to find people complaining about unavailable services during late evening or weekends, which results in lost productivity.”
“Most people haven't experienced a major failure and just believe that somehow everything will be fine,” said Nickolett. “Usually they don’t pay attention until they experience a big outage, lose data, or have to replace an entire system – only then are they very receptive to best practices.”
Cooling Best Practices
There is, of course, a litany of best practices that data centers have applied for years, many of which apply to small server rooms and closets. These include: arranging servers in rows so that cold air comes in the front and is expelled out the back; keeping the doors to the room closed; ensuring that the flow of cold air actually reaches the equipment; and having redundant AC so that if one unit fails, another takes over.
Bob Spengler, product manager for Liebert Precision Cooling, at Emerson Network Power, laid out specific tips relating to equipment rooms under 500 square feet. Number one on the list is to avoid using AC systems designed for humans – known as comfort AC. This, he said, is probably the number one failing in small business cooling – next to not having any cooling at all.
“Cooling equipment needs to be specifically designed for computers and have adequate temperature and humidity controls,” said Spengler. “If you don’t control the humidity level you either end up with damaged equipment due to static electricity or servers dripping with water due to condensation. Also, it can cost about 50 percent more in operating costs if you try to make do with a comfort cooler.”
He advised installing precision cooling gear, which comes in various configurations depending on the room's needs. You can buy rack enclosures, for example, where the cooling is built into the bottom of the rack. You then slide your servers into the enclosure to keep them cool. Alternatively, you can buy wall-mount, ceiling-mount or standalone cooling modules that are more than adequate for small spaces. Liebert and APC are two of the principal suppliers, and both offer plenty of options for small businesses.
Another best practice is to seal off the space where equipment operates. That means: no open windows or doors; no missing ceiling tiles; no cracks where air can escape, such as between ceiling tiles, under doors or where piping comes through a wall; and shutting off the ducts and vents from the existing comfort AC system – you don’t want the two AC systems mixing. The more control you exert over the air in your server room, the fewer surprises you will experience.
“Opening a window to let the heat out of a small space that contains a few servers is wrong since you have no control over the humidity,” said Spengler.
Tight control of temperature is also a vital factor in equipment uptime. It isn’t common knowledge, but temperature changes affect equipment reliability. For every 18 degrees F above 70 degrees, electronics reliability is reduced by 50 percent. Therefore, it is best to set the AC to run at around 70 degrees or just a little higher – no more than 77 degrees.
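The rule of thumb above can be expressed as a simple halving formula. This sketch (the function name and the flat baseline below 70 degrees are our own assumptions, not from the article) estimates relative reliability at a given room temperature:

```python
def relative_reliability(temp_f):
    """Relative electronics reliability vs. a 70 degree F baseline.

    Article's rule of thumb: reliability is cut by 50 percent for
    every 18 degrees F above 70 degrees F.
    """
    if temp_f <= 70:
        # Assumption: treat 70 F and below as the baseline (factor 1.0).
        return 1.0
    return 0.5 ** ((temp_f - 70) / 18)

print(round(relative_reliability(77), 2))  # recommended upper limit, about 0.76
print(round(relative_reliability(88), 2))  # 18 degrees above baseline, 0.5
```

By this estimate, even running at the article's 77-degree ceiling already gives up roughly a quarter of baseline reliability, which is why Spengler pushes the setpoint toward 70.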
The next tip is to avoid, if possible, mixing people with a lot of computers. People add heat to the space and put extra strain on the AC and the servers. So place your servers and other equipment in a closet or small room in order to create a tightly controlled environment.
For companies with several server racks, it is vital to ensure the cold air actually gets to where it is needed and doesn’t mix with the hot air being shoved out the back of the server. You can have a situation, for example, where you are pumping enough cold air into the room, but it isn’t getting to the top of the racks.
“Eighty percent of failures due to heat will be found in the top third of the rack,” said Spengler.
He recommends the use of blanking panels on the front of the rack to cover the slots where servers are missing. Without them, cold air passes straight through to the back of the rack and mixes with the hot exhaust air, rather than making its way up to the top of the rack. Where a whole rack sits empty, you can put blocker panels across the front to prevent any cold air from getting in.
International Data Corp. (IDC) has been tracking data-center power and cooling issues for years via an annual survey. And small businesses are suddenly in the spotlight.
“Smaller installations such as server closets and rooms register highest in terms of cooling issues,” said Jed Scaramella, an analyst at IDC. “There are some very easy solutions customers can adopt such as blanking panels between racks to improve air flow.”
He recommends that small businesses call in outside help to figure out their cooling needs as there are too many ways to get it wrong. And the results can be disastrous. “Consider the expertise of a third party service provider,” said Scaramella. “With space and thermal issues, it is somewhat of a science that goes into running a computer room.”
Drew Robb is a Los Angeles-based freelancer specializing in technology and engineering. Originally from Scotland, he graduated with a degree in geology from Glasgow's Strathclyde University. In recent years he has authored hundreds of articles as well as the book, Server Disk Management by CRC Press.
This article was first published on SmallBusinessComputing.com.