When Seattle-based TeleCommunication Systems (TCS) began drawing up plans for a new data center several years ago, the company ran into a problem: the local utility said the grid didn’t have enough available power to support the expansion.

The servers, switches, and hard drives housed in data centers produce a tremendous amount of heat. The problem is that they also have very tight temperature tolerances: just a few degrees above spec, and entire IT networks can come crashing down, affecting millions of people. In the case of TCS, the company handles a large percentage of the 911 calls in the United States through its network, so if its servers overheat, it’s truly a matter of life and death.

For these reasons, data centers are designed with multiple levels of redundancy built into their powerful HVAC systems. Energy use for cooling is an enormous expense for companies like TCS, not to mention an enormous environmental cost for the planet.

It is exactly this equation that Chatsworth Products, Inc. (CPI) specializes in solving. For the last decade (an eternity in terms of IT development), the standard data center design has employed the “hot aisle/cold aisle” method: two long rows of cabinets are aligned so their intake sides face each other, conditioned “cold” air is delivered into the “cold” aisle created between them, and hot air is vented out of the exhaust sides into the “hot” aisles behind. From a thermodynamics perspective, it is pretty rudimentary compared to the high-tech HVAC acrobatics that LEED designers are used to.

“Even though you had the cabinets properly positioned and were delivering chilled air where you needed to, the room itself had issues where there was heat coming off the back side of the equipment that would somehow find its way back up to the front and mix with the chilled air,” says Sam Rodriguez, a product manager for cabinet and thermal solutions at CPI, of the old system. Rodriguez is part of a team at CPI that has helped develop an alternative method employing passive cooling principles to cut data centers’ cooling energy use by up to 40%.

Introducing an atypical “Glacier White” finish at BendBroadband, a CPI client, not only makes for a sleek look but also improves visibility, which has allowed the facility to reduce lighting by 30%.

CPI was brought onto the project by the design-build firm McKinstry to help TCS solve the problem of power availability. Using the company’s “cooling wall” system of evaporative-only cooling and precision airflow containment, the TCS facility became one of the most efficient data centers in the Pacific Northwest, saving an estimated 513,000 kilowatt-hours per year. It has since won one national and one regional ASHRAE award, with a remarkable average power usage effectiveness (PUE) score of 1.15.
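For readers unfamiliar with the metric, power usage effectiveness is the ratio of the total energy a facility consumes to the energy consumed by the IT equipment alone, so a perfect score would be 1.0. A quick back-of-the-envelope reading of that number (the arithmetic below is illustrative, not TCS’s reported data):

\[ \mathrm{PUE} = \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}}, \qquad \mathrm{PUE} = 1.15 \;\Rightarrow\; E_{\text{overhead}} = 0.15 \times E_{\text{IT}} \]

In other words, for every kilowatt-hour the servers draw, only about 0.15 kilowatt-hours goes to cooling, lighting, and power distribution combined; industry surveys have historically put the average PUE well above 1.5.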

Rodriguez says the approach is quite simple and actually starts with the basic concept of hot aisle/cold aisle, but “takes it to another level in terms of how you control and isolate airflow.” The complicated part is working within the precise parameters required for a data center equipment cabinet.

The architecture of the cabinets includes many places where hot air can leak back to the cold air zone: in the gaps between the equipment and the cabinet walls, in the space where the cabinet is elevated off the floor on casters, even in the tiny spaces between the cables where they come up through the floor of the cabinet. Working with the care of a brain surgeon, the CPI team has fine-tuned the innards of data center cabinets, sealing off every possible point of leakage between the cold air zone at the front and the hot air zone at the back with carefully crafted air dams.

“Studies have shown that up to 50% of the conditioned air used to cool equipment and the related cost to cool that air is wasted,” Rodriguez says of conventional cabinet designs, where air is allowed to flow freely. CPI’s Passive Cooling Systems are designed for less than 5% leakage under normal operating conditions, and customers that practice strict airflow management techniques achieve under 2%.
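Some rough arithmetic (with round, invented numbers rather than CPI measurements) shows why those percentages matter so much. If a fraction \(f\) of the supplied air bypasses the equipment, the air handlers must condition and move

\[ Q_{\text{supply}} = \frac{Q_{\text{IT}}}{1 - f} \]

to deliver the airflow \(Q_{\text{IT}}\) the servers actually need. For a row of cabinets requiring 1,000 CFM of chilled air, 50% leakage forces the cooling plant to supply roughly 2,000 CFM; at 5% leakage, about 1,053 CFM suffices, and at 2%, just 1,020 CFM.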

In the old system, the cabinet had a ventilated door on the back side so the hot air could escape. The CPI system instead uses solid rear doors with gaskets, while a rectangular duct on top of each cabinet acts like a chimney, whisking air from the exhaust side of the cabinet directly into the return air plenum in the ceiling above. What is most incredible about the CPI system is that the highly effective use of seals to isolate the hot and cold zones lets the air handler draw air through the cabinet without the need for additional fans, and the complete segregation of hot and cold air makes for very efficient operation of the cooling system. As Rodriguez points out: what does a fan do besides suck electricity, add heat to the system with its motor, and eventually wear out, putting a mission-critical system at risk? With no additional fans required within the cabinets, the only fans used to move air are the existing fans within the servers and the facility’s air handler.

It turns out that more and more telecom companies are thinking like CPI: passive cooling has multiple layers of benefits. Because it makes data center HVAC systems more efficient, more cabinets can fit into less space, cutting down on real estate costs. For huge providers like Telefonica Vivo, which provides IT services to half of Brazil’s population and is another CPI client, those savings add up, allowing it to improve service and reliability while keeping its commitment to sustainability. Its new 362,000-square-foot facility is the most energy-efficient in the country and one of only five data centers in the world to achieve LEED Gold certification.

Another of those five data centers, BendBroadband in Bend, Oregon, is also a CPI client. The passive cooling approach was the basis for achieving that level of energy efficiency, but another, more visually striking aspect of the cabinet design also contributed to the LEED rating. The traditional color for data center cabinets is jet black, but CPI has introduced a “Glacier White” finish that completely changes the ambiance. The bright, clean look of a white cabinet has practical importance as well: a surface that reflects light instead of absorbing it makes it easier for technicians to see inside the cabinet during equipment changes and maintenance. It seems like a no-brainer, but the improved visibility has allowed BendBroadband to reduce lighting in its data center by 30%.

Passive cooling has been called a low-tech approach to air conditioning, but thinking in those terms has allowed one of the most high-tech industries there is to “smarten” its HVAC approach. Thus, it seems a new industry of green IT infrastructure has been born.