Implementing Sustainable Data Center Practices

by John Collins | Jul 17, 2012

Environmental stewardship, innovation and leadership are becoming increasingly important as we take steps to create a sustainable environment for future generations. This is particularly true for the construction of new data centers and major renovations of older ones.

The Leadership in Energy & Environmental Design (LEED) certification process offers a useful framework for organizing green strategies. To earn LEED certification, buildings must exhibit environmental responsibility in seven areas:

  • Sustainable sites
  • Water efficiency
  • Energy and atmosphere
  • Materials and resources
  • Indoor environmental quality
  • Innovation and design
  • Regional credits

The following sections discuss data center sustainability strategies in the first two LEED areas: sustainable sites and water efficiency. For background, see my previous articles, Understanding Data Center Sustainability and Difficulties of Going Green in the Data Center. Future installments in this series will cover best practices for achieving LEED credits in each of the remaining five areas.

Sustainable Sites

Limiting physical sprawl is one of the most important goals of sustainable site development. Companies can reduce the amount of space their data centers occupy by deploying compact infrastructure resources. For example, many late-model uninterruptible power system (UPS) products feature footprints as much as 50 to 60 percent smaller than previous-generation models. Similarly, companies that operate their data centers at 400V can eliminate transformer-based power distribution unit (PDU) cabinets, reducing their power distribution footprint by 50 to 60 percent. On the mechanical side, using commercially packaged air handling units and in-row cooling (where needed) can eliminate traditional computer room air conditioners (CRACs), which take up significant space in many data halls.
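To put those footprint claims in concrete terms, the short Python sketch below estimates the floor space reclaimed by swapping legacy power equipment for compact alternatives. The 50 to 60 percent reduction range comes from the figures above; the legacy square footages are purely illustrative assumptions, not vendor specifications.

```python
# Hypothetical illustration: rough floor-space savings from compact
# power infrastructure. All square footages below are assumptions.

def estimate_footprint_savings(legacy_sq_ft: float, reduction_pct: float) -> float:
    """Return the estimated floor space (sq ft) freed by replacing a
    legacy component with a compact alternative."""
    return legacy_sq_ft * reduction_pct

# Assumed legacy footprints (sq ft); the 50-60% reduction range comes
# from the figures cited above for compact UPS units and the removal
# of transformer-based PDU cabinets at 400V.
legacy = {
    "UPS lineup": 300.0,     # assumption
    "PDU cabinets": 200.0,   # assumption
}

for name, sq_ft in legacy.items():
    low = estimate_footprint_savings(sq_ft, 0.50)
    high = estimate_footprint_savings(sq_ft, 0.60)
    print(f"{name}: roughly {low:.0f}-{high:.0f} sq ft reclaimed")
```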

Organizations can further promote sustainable site development in their new and existing data centers by embracing practices like the following:

  • Locating new facilities in existing industrial zones instead of on undeveloped land
  • Installing electric vehicle charging stations in the parking lot
  • Minimizing pollution during construction by controlling soil erosion, waterway sedimentation and airborne dust generation
  • Providing easy access to public transportation, bicycle racks and changing rooms
  • Limiting parking capacity to the minimum mandated under local zoning regulations
  • Reducing light pollution by automatically shutting off interior lights during late-night hours and providing external lights only as required for safety and comfort

Water Efficiency

The average enterprise data center consumes enormous volumes of water. In fact, a 15-megawatt data center can use up to 360,000 gallons (about 1.36 million liters) of water a day to cool high-density server systems, according to data center designer James Hamilton. Data center operators can reduce water usage by employing these and other techniques:
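As a quick sanity check on that figure, the sketch below converts the cited 360,000 gallons per day into liters and into a rough per-megawatt rate. The conversion factor is standard; everything else simply restates the numbers above.

```python
# Back-of-the-envelope check of the figure cited above: a 15 MW data
# center using up to ~360,000 gallons of cooling water per day.

GALLONS_TO_LITERS = 3.785      # liters per US gallon

facility_mw = 15               # facility size from the article
gallons_per_day = 360_000      # upper-bound figure cited above

liters_per_day = gallons_per_day * GALLONS_TO_LITERS
gallons_per_mw_day = gallons_per_day / facility_mw

print(f"{gallons_per_day:,} gal/day ≈ {liters_per_day:,.0f} liters/day")
print(f"Roughly {gallons_per_mw_day:,.0f} gallons per MW per day")
```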

Raising server inlet temperatures: Historically, data center operators have typically kept “cold aisle” temperatures at roughly 65°F/18°C. According to a recent publication by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), however, certain classes of data centers can safely operate at temperatures as high as 104°F/40°C. While few companies are pushing these extremes, it is not uncommon to see operating targets approaching 80°F/27°C. Raising data center temperatures even a few degrees can dramatically reduce the amount of water consumed by cooling systems.
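As a simple illustration of how an operator might track cold-aisle readings against these targets, the sketch below classifies hypothetical inlet temperatures using the setpoints cited above. The threshold values restate the article's figures; the sensor readings and category descriptions are assumptions for illustration only.

```python
# Minimal sketch: flag cold-aisle inlet readings against the figures
# cited above. Sensor readings below are made up for illustration.

LEGACY_SETPOINT_F = 65.0        # historical cold-aisle setpoint
MODERN_TARGET_F = 80.0          # typical modern operating target
ASHRAE_UPPER_BOUND_F = 104.0    # allowable for certain equipment classes only

def classify_inlet_temp(temp_f: float) -> str:
    if temp_f <= LEGACY_SETPOINT_F:
        return "overcooled relative to modern targets (wasting water and energy)"
    if temp_f <= MODERN_TARGET_F:
        return "within a typical modern operating target"
    if temp_f <= ASHRAE_UPPER_BOUND_F:
        return "allowable for some equipment classes; verify vendor specs"
    return "above the ASHRAE allowable range; investigate immediately"

for reading in (62.0, 75.5, 88.0, 110.0):   # hypothetical sensor data
    print(f"{reading:.1f} °F -> {classify_inlet_temp(reading)}")
```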

Raising inlet temperatures does pose potential risks. For one, should cooling systems in a warmer data center fail, managers will have significantly less time to react before their servers reach thermal shutdown. Additionally, operating your data center at higher temperatures can shorten the lifespan of UPS batteries, potentially resulting in higher maintenance and replacement costs.

Employing air-side and water-side economization: Most data centers collect hot exhaust air and return water, chill them and then recirculate them. Facilities that utilize “air-side economization,” however, simply pump hot internal air out of the building and pipe cool external air in. “Water-side economization” is a similar process in which return water is pumped through an external radiator or cooling tower rather than a chiller. Both techniques can significantly lower cooling-related water requirements. Moreover, studies have shown them to be viable options for at least part of the day even in mild or warm climates.

Like raising server inlet temperatures, however, air- and water-side economization are not for everyone. For one, they tend to deliver the best results in cooler northern climate zones; the further south a given data center is located, the less likely it is that these techniques will be cost-effective. In addition, pumping outside air into a server room without proper filtering and conditioning can expose sensitive electronic equipment to higher levels of humidity and airborne particulates, shortening its lifespan and possibly causing failures.
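To illustrate the kind of screening an operator might do before committing to air-side economization, the sketch below counts how many hours in a sample day of outdoor temperatures fall under an assumed maximum supply-air temperature. Both the threshold and the hourly readings are hypothetical; a real assessment would use local weather data and would also account for humidity and particulate limits.

```python
# Hedged sketch: estimate economizer-eligible hours from a sample of
# outdoor dry-bulb temperatures. Threshold, approach margin and sample
# data are illustrative assumptions, not design guidance.

from typing import Iterable

def economizer_hours(outdoor_temps_f: Iterable[float],
                     max_supply_f: float = 72.0,
                     approach_f: float = 3.0) -> int:
    """Count hours where outdoor air (plus a small approach margin for
    fan and duct heat gain) is cool enough to serve as supply air."""
    return sum(1 for t in outdoor_temps_f if t + approach_f <= max_supply_f)

# One hypothetical day of hourly outdoor temperatures (°F)
sample_day = [58, 56, 55, 54, 54, 55, 58, 62, 66, 70, 74, 77,
              79, 80, 79, 77, 74, 71, 68, 65, 63, 61, 60, 59]

eligible = economizer_hours(sample_day)
print(f"{eligible} of {len(sample_day)} hours eligible for air-side economization")
```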

Utilizing recycled water: Especially progressive data center operators are beginning to use recycled water sources, such as rain water and waste water, in place of fresh water. Google Inc., for example, has switched to using recycled water at its Douglas County, Ga., data center rather than continue to tap drinking water as it had when the facility opened in 2007.

Though still largely an experimental strategy, using recycled water can considerably ease pressures on local water supplies. On the other hand, it also imposes added costs. Companies must equip their facilities with purification systems capable of filtering incoming water sufficiently for safe use in cooling systems.

For example, Google built a “side stream facility” five miles from the data center that treats and diverts as much as 30 percent of the recycled water and sends it to the data center to be used for evaporative cooling. What water remains is sent to Google’s own Effluent Treatment Plant where it is disinfected and cleaned before it flows into the Chattahoochee River.
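The arithmetic below illustrates the kind of flow split described above. The 30 percent diversion share comes from the article; the total recycled-water flow and the evaporation fraction are assumptions for illustration only, not Google figures.

```python
# Illustrative flow arithmetic for a side-stream arrangement. Only the
# 30% diversion share comes from the article; all other values are
# assumptions for illustration.

total_recycled_gal_per_day = 1_000_000   # assumed recycled-water flow
diverted_share = 0.30                     # up to 30% diverted to the data center
evaporated_share = 0.70                   # assumed fraction lost to evaporative cooling

to_data_center = total_recycled_gal_per_day * diverted_share
evaporated = to_data_center * evaporated_share
returned_to_river = to_data_center - evaporated   # treated, then discharged

print(f"Sent to data center:  {to_data_center:,.0f} gal/day")
print(f"Lost to evaporation:  {evaporated:,.0f} gal/day")
print(f"Treated and returned: {returned_to_river:,.0f} gal/day")
```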

Closing Thoughts

While there are many sound reasons for wanting a greener data center, realizing that goal cost-effectively is easier said than done. Just the same, most organizations can benefit from at least some green strategies without compromising corporate growth or IT reliability.

Some sustainability strategies, such as using recycled water, are still experimental, require heavy upfront investments or come with significant potential drawbacks. Most, however, are proven, cost-effective and relatively simple.

Stay tuned for the rest of this series, in which I will explore available methods of achieving LEED credits in the remaining five areas. Up next: energy and atmosphere, and materials and resources.

John Collins is segment manager of data centers for Eaton Corporation. To learn more about Eaton’s sustainable electrical solutions, please visit www.eaton.com/leed.
