LONDON — Electricity used by servers in datacenters in Western Europe grew at an annual rate of 17 percent from 2000 to 2005, slightly above the world average of 16 percent. The study found that server electricity use in the Asia/Pacific region (excluding Japan) grew at a 23 percent annual rate, making this the only region where server electricity use grew significantly faster than the world average.
A report prepared by Dr. Jonathan Koomey for AMD, using data from industry analyst firm IDC, shows that electricity used by servers in the United States and Europe currently comprises about two-thirds of the world's total, with Japan, the Asia/Pacific region, and the rest of the world each accounting for between 10 and 15 percent of the total.
The study forecasts that, based on current growth trends, the U.S. share of worldwide server electricity use will likely decline from 40 percent in 2000 to about one-third by 2010, while the Asia/Pacific region (excluding Japan) will increase its share from 10 percent to about 16 percent over the same period.
Under this scenario, absolute electricity consumption by servers in the Asia/Pacific region would more than double from 2005 to 2010, requiring additional generating capacity equal to the output of two new 1000 MW power plants. For the world as a whole, the growth in server electricity consumption from 2005 to 2010 would require additional capacity equivalent to more than ten 1000 MW power plants.
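For readers who want to check the power-plant equivalence, here is a minimal sketch of the arithmetic. It assumes an idealized 1000 MW plant running around the clock and uses a hypothetical growth figure; neither assumption is taken from the report itself.

```python
# Rough plant-equivalence arithmetic (illustrative assumptions, not figures
# from the Koomey report): an idealized 1000 MW plant running all year
# produces 1000 MW * 8760 h = 8.76 TWh of electricity.

HOURS_PER_YEAR = 8760
PLANT_MW = 1000

def plant_equivalents(added_twh: float, capacity_factor: float = 1.0) -> float:
    """How many 1000 MW plants cover a given increase in annual demand (TWh)."""
    plant_output_twh = PLANT_MW * HOURS_PER_YEAR * capacity_factor / 1e6
    return added_twh / plant_output_twh

# Hypothetical example: ~90 TWh of added annual demand needs roughly ten
# such plants at full output; a lower capacity factor raises the count.
print(round(plant_equivalents(90.0), 1))       # ~10.3
print(round(plant_equivalents(90.0, 0.9), 1))  # ~11.4
```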
This new research adds detail to an AMD-sponsored study published in February that quantified the worldwide costs associated with datacenter energy use. That earlier study found that in 2005 total datacenter electricity consumption in the United States (including servers, cooling, and auxiliary equipment) was approximately 45 billion kWh, resulting in utility bills of about $2.7 billion, and it estimated that worldwide datacenter electricity consumption costs approximately $7.2 billion annually.
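As a quick consistency check on the U.S. figures, the implied average electricity price can be recovered from the two numbers quoted above; the sketch below uses only those reported values, and the resulting rate is my own back-of-the-envelope calculation, not one stated in the study.

```python
# Back-of-the-envelope check of the reported 2005 U.S. figures: 45 billion kWh
# and a $2.7 billion utility bill imply an average rate of about 6 cents/kWh.

us_datacenter_kwh = 45e9      # reported 2005 U.S. datacenter consumption
us_utility_bill_usd = 2.7e9   # reported total utility cost

implied_rate = us_utility_bill_usd / us_datacenter_kwh
print(f"Implied average rate: ${implied_rate:.3f}/kWh")  # ~$0.060/kWh
```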