The global information technology (IT) sector is responsible for 2 percent of global greenhouse gas emissions, according to a study by The Climate Group. Given the proliferation of technology today (smartphones, tablets, cloud computing, big data), reliance on the data centers and systems that support IT operations around the world will only continue to increase.
Additionally, in a 2013 study of the North American data center market, 98 percent of respondents, who represented large companies with either $1.0 billion in annual revenues or 5,000-plus employees, said they would definitely or probably expand their data center footprints by 2014.
There are legitimate concerns about the amount of energy required to keep these data centers running and to prevent them from overheating, and about the associated environmental impact. However, the total amount of energy required to keep these installations cool can be meaningfully reduced by taking advantage of recent industry changes, new technologies and innovative approaches that allow owners and operators to cool facilities using free, outside air.
Key Industry Changes Allow for More Efficient Cooling
Although data centers previously had been subjected to clean-room-type environmental controls, recent industry changes allow for more efficient cooling technologies.
Using free air to cool data centers, for example, was not widely considered a viable means of maintaining critical temperatures until a series of ASHRAE TC 9.9 recommendations was published in 2011 (it's worth noting that expanded guidelines were also introduced in 2004 and 2008). Before 2011, all but the most progressive data center operators were reluctant to test the higher temperatures and wider humidity bands on their own critical operations.
The 2011 recommendations provided significantly more guidance and data to support prolonged data center operation within the temperature and humidity bands originally put forth in 2008. Furthermore, ASHRAE delineated the energy-efficiency benefits associated with added economization hours. Until the recent industry-wide acceptance of the 2011 guidelines, designing a data center to intentionally allow the ingress of large quantities of outside air was considered somewhat radical and risky.
Historically, data centers were typically cooled at all times by either water-intensive condenser and chiller plants or direct-expansion, compressor-driven circuits, with little consideration given to using low ambient conditions to reduce energy consumption, aside from some progressive deployments of water-side economization. The tight window of environmental control was driven by the requirements of older IT equipment: tightly controlled conditions were considered necessary to maintain the reliability of mainframe systems, stand-alone proprietary equipment and rack-mounted units.
With the expanded operating parameters, however, outside air could be discussed openly and seriously considered for use in business-critical applications. No longer would the data center industry be tied to tightly sealed data center rooms wholly dependent upon compressor-centric technology to control the environment for all 8,760 hours of the year.
An Inflection Point for the Data Center Industry
The movement led by ASHRAE TC 9.9 to increase the allowable temperature and humidity windows, implemented with buy-in from the major data center IT equipment manufacturers, together with all the associated changes, represents an inflection point for the data center industry.
Key attributes of the cumulative changes include: further standardization of recommended data center layouts, as well as of environmental measuring points, to ensure consistency in the measurement of requirements and performance; empirical data measuring actual IT equipment failure rates at various operating temperatures and humidity levels; and, most importantly, buy-in from the equipment manufacturing community to support the operation of equipment within the recommended and allowable environmental bands.
Although some legacy facilities and more conservative operators have been slower to adopt these changes, there has been widespread acceptance in the industry. This, in turn, has enabled the deployment of new, more efficient cooling topologies, including free-air cooling, and has helped lower the energy consumption and overall cost of data center operations.
Free Air Is Now a ‘Must-Consider’ Technology
Just one example of the practical application of the 2011 guidelines can be seen in Digital Realty’s recent deployment of multiple data centers in Ashburn, Virginia. These facilities use rooftop units to deliver supply air via vertical shafts to traditional pressurized raised-floor plenums, a design that has proven successful in the more moderate climate of Santa Clara, California.
With the use of primary and secondary filtration systems, these same installations take advantage of the expanded humidity band to provide full free cooling, with no mechanical component, for 48 percent of the year (around 4,205 hours). Mechanical cooling is required for only 43 percent of the year, with the remaining time covered by partial free cooling. This represents significant savings over using traditional mechanical cooling systems 100 percent of the time.
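The hour breakdown above can be sanity-checked with simple arithmetic. In this sketch, the percentages are the article's figures and the hour counts are derived from the 8,760 hours in a year:

```python
# Sanity check of the cooling-mode hour breakdown cited above.
# Percentages come from the article; hour counts are derived.
HOURS_PER_YEAR = 8760

full_free_frac = 0.48   # full free cooling, no mechanical component
mechanical_frac = 0.43  # mechanical cooling required
# The remainder of the year is covered by partial free cooling.
partial_free_frac = 1.0 - full_free_frac - mechanical_frac

full_free_hours = round(full_free_frac * HOURS_PER_YEAR)
mechanical_hours = round(mechanical_frac * HOURS_PER_YEAR)
partial_free_hours = HOURS_PER_YEAR - full_free_hours - mechanical_hours

print(full_free_hours, mechanical_hours, partial_free_hours)
# → 4205 3767 788
```

The 4,205 full-free-cooling hours match the "around 4,205 hours" figure quoted above, leaving roughly 9 percent of the year in partial free cooling.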
For most US and some Asian cities, it is possible to build data centers that rely on the local climate for their cooling needs for half of the year. Most Western European cities can support chiller-less data center operations for all but a very small percentage of the year. A very important added benefit to using outside air is that it can sever, or at least drastically reduce, industry reliance on water resources for cooling data centers. Coupling this with the very real energy advantages makes it a must-consider technology for future data center designs.
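Climate-driven estimates like those above boil down to counting the hours in a year when outside air alone is cool enough to serve as supply air. A minimal sketch, assuming hourly dry-bulb temperature data is available and using 27 °C (the upper bound of the ASHRAE recommended envelope) as an illustrative threshold; real designs also evaluate humidity, filtration and local setpoints:

```python
def free_cooling_hours(hourly_dry_bulb_c, max_supply_c=27.0):
    """Count the hours when outside air alone could serve as supply air.

    27 degC approximates the upper bound of the ASHRAE recommended
    envelope; humidity and filtration checks are omitted in this sketch.
    """
    return sum(1 for t in hourly_dry_bulb_c if t <= max_supply_c)

# Hypothetical year in which exactly half the hours are at or below 27 degC,
# i.e. a climate that supports free cooling for half of the 8,760-hour year:
temps = [20.0] * 4380 + [32.0] * 4380
print(free_cooling_hours(temps))  # → 4380
```

Running this against real hourly weather data for a candidate site gives a first-order estimate of how many of the year's 8,760 hours could be served without mechanical cooling.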
By partnering with data center providers that use outside air to cool data centers, companies can drive down total cost of occupancy, reduce water usage, simplify operations, improve power usage effectiveness (PUE) and realize significant energy savings. In fact, PUEs of less than 1.3 are achievable. Outside air should be treated as a viable option when evaluating cooling choices for data center deployments.