White Paper: IT Buyers Need to Look at the Total Cost of Data Centers

Aug 20, 2009

In data centers, all costs related to the purchase of electricity, and almost all facility costs, are directly tied to the power use of IT equipment, according to a new white paper released by Intel and Microsoft. This means that power used per thousand dollars of server acquisition cost is the most important driver of power and cooling costs in data centers.

According to the white paper, "Server Energy Efficiency Implications on Large Scale Datacenters," the power-related capital costs for cooling, backup power, and power distribution are significant, at roughly $25,000 per kW of IT power use, and together with electricity costs they account for roughly half of total annualized costs in typical data centers.
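
To make those numbers concrete, here is a minimal Python sketch of how such annualized costs might be tallied for a single server. Only the roughly $25,000-per-kW infrastructure figure comes from the paper; the server price, power draw, electricity rate, PUE, and amortization periods below are assumed placeholders.

```python
# Illustrative annualized cost tally for one server (hypothetical inputs).
# Only the ~$25,000/kW power-related capital figure comes from the paper;
# every other number below is an assumed placeholder.

SERVER_PRICE = 4_000          # USD, acquisition cost (assumed)
SERVER_POWER_KW = 0.4         # average IT power draw (assumed)
ELECTRICITY_RATE = 0.10       # USD per kWh (assumed)
PUE = 2.0                     # total facility power / IT power (assumed)
INFRA_COST_PER_KW = 25_000    # USD per kW of IT power (from the paper)
SERVER_LIFE_YEARS = 4         # server amortization period (assumed)
INFRA_LIFE_YEARS = 15         # facility capital amortization period (assumed)

HOURS_PER_YEAR = 8_760

annual_server_capital = SERVER_PRICE / SERVER_LIFE_YEARS
annual_electricity = SERVER_POWER_KW * PUE * HOURS_PER_YEAR * ELECTRICITY_RATE
annual_infra_capital = SERVER_POWER_KW * INFRA_COST_PER_KW / INFRA_LIFE_YEARS

total = annual_server_capital + annual_electricity + annual_infra_capital
power_related = annual_electricity + annual_infra_capital

print(f"server capital:         ${annual_server_capital:,.0f}/yr")
print(f"electricity:            ${annual_electricity:,.0f}/yr")
print(f"infrastructure capital: ${annual_infra_capital:,.0f}/yr")
print(f"power-related share:    {power_related / total:.0%}")
```

With these placeholder inputs, the power-related share comes out near 60 percent, in the same ballpark as the paper's "roughly half" finding.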

The research breaks power used per thousand dollars of server cost into two components: performance per dollar of server cost and performance per watt. Performance per dollar has been improving more rapidly than performance per watt in recent years, which has pushed up power use per dollar of server cost, according to the paper.
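
The decomposition is simple unit algebra: performance per dollar divided by performance per watt leaves watts per dollar. A minimal sketch with made-up benchmark figures shows how faster growth in performance per dollar raises power per dollar of server cost:

```python
# Watts per $1000 of server cost from the two components in the paper:
#   (perf / dollar) / (perf / watt) = watts / dollar
# The benchmark figures below are made-up placeholders.

perf_per_dollar = 50.0   # benchmark units per USD (assumed)
perf_per_watt = 200.0    # benchmark units per watt (assumed)

watts_per_kilodollar = 1000 * perf_per_dollar / perf_per_watt
print(f"{watts_per_kilodollar:.0f} W per $1000 of server cost")   # -> 250 W

# If perf/$ doubles while perf/W improves only 50%, power per dollar rises:
watts_after = 1000 * (2 * perf_per_dollar) / (1.5 * perf_per_watt)
print(f"{watts_after:.0f} W per $1000 after uneven improvement")  # -> 333 W
```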

The result is that the indirect costs for cooling and power distribution have begun to offset the performance-related benefits of Moore's law, say the researchers. They also conclude that too little attention has been paid to the true total costs of data center facilities, and that these trends have important implications for the design, construction, and operation of data centers.

A key finding of the paper is that an IT buyer who focuses solely on performance per dollar of server acquisition cost, rather than assessing the total cost of purchasing new servers, will overestimate the benefits of buying more computing power. This mismatch between costs and benefits is the primary reason institutional changes are needed in most data center operations, which traditionally keep separate budgets for the IT and facilities departments, suggest the researchers.
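
To illustrate the mismatch, the sketch below compares two made-up servers with identical performance. The cheaper one wins on performance per dollar of acquisition cost, yet loses once electricity and power-related capital (the paper's roughly $25,000 per kW, amortized here over an assumed 15 years) are annualized in. All other figures are placeholders.

```python
# Two hypothetical servers compared on acquisition cost alone versus
# annualized total cost per unit of performance. All inputs are assumed
# except the ~$25,000/kW infrastructure figure, which is from the paper.

HOURS_PER_YEAR = 8_760
RATE = 0.10                      # USD per kWh (assumed)
PUE = 2.0                        # facility power / IT power (assumed)
INFRA_PER_KW_YEAR = 25_000 / 15  # $25k/kW amortized over an assumed 15 years
SERVER_LIFE_YEARS = 4            # server amortization period (assumed)

def annual_cost_per_perf(price, power_kw, perf):
    """Annualized total cost per benchmark unit for one server."""
    capital = price / SERVER_LIFE_YEARS
    electricity = power_kw * PUE * HOURS_PER_YEAR * RATE
    infrastructure = power_kw * INFRA_PER_KW_YEAR
    return (capital + electricity + infrastructure) / perf

# Same performance, but the cheap server draws more power.
cheap = annual_cost_per_perf(price=3_000, power_kw=0.50, perf=100)
efficient = annual_cost_per_perf(price=4_000, power_kw=0.30, perf=100)

print(f"cheap server:     ${cheap:.2f} per benchmark unit per year")      # ~24.59
print(f"efficient server: ${efficient:.2f} per benchmark unit per year")  # ~20.26
```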

There are technical solutions for improving data center efficiency, but the most important and most neglected solutions involve institutional changes that help companies focus on reducing the total cost of computing services, say the researchers.

The reason: IT departments generally don't pay the electric bill or the cost of building cooling and power distribution capacity, so they have little incentive to demand high-efficiency servers; the costs of inefficiency come out of someone else's budget. Cloud computing providers have generally been ahead of the industry in fixing these misplaced incentives, which gives them an economic advantage over in-house corporate data center operators, according to the research.

There are some signs that the industry's focus on reducing server power use since 2006 has been paying off, although more research is needed to confirm this finding, say the researchers.

The research was co-authored by Dr. Jonathan Koomey, Project Scientist at Lawrence Berkeley National Laboratory, Consulting Professor at Stanford University, and Visiting Professor at Yale University's School of Forestry and Environmental Studies. Dr. Koomey is one of the leading international experts on electricity use by computers, office equipment, and data centers, as well as on energy conservation technology, economics, policy, and global climate change.

Intel and Microsoft also offer other peer-reviewed research papers that assess the energy and environmental impacts of information technology in the areas of computer performance and music downloads.
