Google, Data Centers Using Less Power Than Expected

Biswajit.HD

Data centers have been using less electricity than you might think, or at least less than past trends suggested they would.
According to a study by Jonathan Koomey, a consulting professor at Stanford University and a climate and energy researcher, worldwide data center energy use rose only about 56% over the last five years, versus doubling between 2000 and 2005. In the U.S., it rose only 36% instead of doubling.
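To put those figures on a common footing, here is a quick back-of-the-envelope calculation of my own (the annualized rates are not stated in the article): a 56% rise over five years works out to roughly 9% per year, while doubling over five years is roughly 15% per year.

# Quick annualized-growth arithmetic (my own illustration, not from the study).
def annualized(total_growth, years):
    """Convert total growth over a period into a compound annual growth rate."""
    return (1 + total_growth) ** (1 / years) - 1

print(f"56% over 5 years (2005-2010, worldwide): {annualized(0.56, 5):.1%} per year")  # ~9.3%
print(f"Doubling over 5 years (2000-2005):       {annualized(1.00, 5):.1%} per year")  # ~14.9%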

Electricity used in global data centers in 2010 accounted for between 1.1% and 1.5% of total electricity use, the Koomey study found. For the U.S., that number was between 1.7% and 2.2%.

And less than 1% of electricity used by data centers worldwide was consumed by Google, perhaps the highest-profile user of data center servers.

The Koomey report attributes the lower usage to a lower-than-predicted server installed base rather than energy-efficiency improvements. Indeed, electricity used in U.S. data centers in 2010 was significantly lower than predicted by the EPA's 2007 report to Congress on data centers, the Koomey report found.

Growth in the installed base of servers in data centers had already begun to slow by early 2007 because of virtualization and other factors. And the 2008 recession, combined with further improvements in virtualization, led to a significant reduction in actual server installed base by 2010 when compared with an installed base forecast published in 2007 by market research firm IDC.

Servers are still the largest and most important electricity consumer in data centers, the Koomey report states. And Google is estimated to have about 900,000 of them, compared to 25,000 in 2000.

Growth in electricity used per server may have accounted for a larger share of demand growth from 2005 to 2010 than it did in 2000 to 2005, the report suggests. Meanwhile, the dominant driver of electricity demand from 2000 to 2005 was growth in the installed base of volume servers, which doubled over that five-year period for the U.S. and the world, according to the Koomey report.

Indeed, Koomey notes that the main reason for the lower estimates in this study is the lower IDC installed base estimates, and not the operational improvements and installed base reductions from virtualization.

And going forward, IDC forecasts virtually no growth in the installed base from 2010 to 2013, which presages continued slower growth in data center electricity use, the Koomey study states. The IDC forecast assumes virtualization will become increasingly prevalent, reducing the need to install new physical servers. That improves energy efficiency while driving up utilization of the remaining servers, spreading energy use and other costs over more computations per server.
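As a rough sketch of that consolidation effect (all numbers below are my own assumptions for illustration; they do not come from IDC or the Koomey report), packing the same workload onto fewer, busier hosts cuts the physical server count and the power that goes with it:

import math

# Illustrative virtualization consolidation model (assumed figures, not from the report).
physical_servers = 1000      # hypothetical pre-virtualization fleet
avg_utilization = 0.10       # assumed utilization of a dedicated, single-purpose server
target_utilization = 0.50    # assumed utilization target for virtualized hosts
watts_per_server = 250       # assumed average draw per physical server

# Same total workload packed onto fewer, busier hosts.
# Simplification: power per server is treated as constant regardless of load.
hosts_needed = math.ceil(physical_servers * avg_utilization / target_utilization)

print(f"{physical_servers} dedicated servers -> {hosts_needed} virtualized hosts")
print(f"IT power: {physical_servers * watts_per_server / 1000:.0f} kW -> "
      f"{hosts_needed * watts_per_server / 1000:.0f} kW")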

And the impact of cloud computing on energy usage? Cloud computing installations typically have much higher server utilization levels and infrastructure efficiencies than do in-house data centers, the Koomey report states. So increased adoption of cloud should result in lower electricity use than if the same computing services were delivered using more conventional approaches.
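The utilization argument can also be made concrete per unit of work. In the toy comparison below, every figure (server draw, utilization levels, PUE overheads, work per server) is an assumption of mine for illustration, not a number from the Koomey report:

# Toy energy-per-unit-of-work comparison (assumed figures, for illustration only).
def kwh_per_unit_of_work(server_kw, utilization, pue, units_at_full_load):
    """Facility energy for one server-hour divided by the useful work delivered."""
    facility_kwh = server_kw * pue                 # server draw plus cooling/power overhead
    useful_work = units_at_full_load * utilization
    return facility_kwh / useful_work

in_house = kwh_per_unit_of_work(server_kw=0.25, utilization=0.10, pue=2.0, units_at_full_load=100)
cloud    = kwh_per_unit_of_work(server_kw=0.25, utilization=0.50, pue=1.2, units_at_full_load=100)

print(f"In-house data center: {in_house:.3f} kWh per unit of work")  # 0.050
print(f"Cloud installation:   {cloud:.3f} kWh per unit of work")     # 0.006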

Source: PC World
 