19 February 2007

Sucking Juice through a Net


Demand grows, but data centers don't hog power
Benjamin Pimentel, Chronicle Staff Writer
Thursday, February 15, 2007

Data centers are sucking up more electricity as more people and organizations log on to the Internet. But there's been some disagreement over how power-hungry the servers running the nation's network are.

Take the previously accepted belief -- now dismissed as inaccurate -- that the Internet and all the computer equipment in the nation require 13 percent of total electricity use in the United States.

A study released Wednesday by Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory and a consulting professor at Stanford University, says that energy consumption at data centers is growing rapidly, having doubled from 2000 to 2005.

But overall, the amount of energy servers consume is only about 1.2 percent of the electricity used in the country, although the rate at which it has grown might still be cause for worry.

The study was commissioned by chipmaker Advanced Micro Devices partly in response to growing concern over corporate customers' rising energy bills.

"If you see that there might be a problem, you have to ask how big is the problem," Koomey said. "The purpose of the study is to give a reasonable estimate of how much electricity is used. From that, you can then figure out whether there are actions that industry can take or that government can take."

According to the study, servers and the infrastructure used to maintain these machines use about 45 billion kilowatt hours a year. That's equivalent to the amount of power used by the state of Mississippi in 2005, Koomey said.

The United States consumed 3,661 billion kilowatt hours of electricity, worth nearly $300 billion, in 2005, Koomey said.

Data centers in the United States spent $2.7 billion on electricity in 2005, the report said. Worldwide, the total cost was $7.2 billion.
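The 1.2 percent figure follows directly from the numbers above, as a quick sanity check shows (using only the 45 billion and 3,661 billion kilowatt-hour figures reported in the article):

```python
# Sanity check on the article's figures: server electricity use as a
# share of total U.S. electricity consumption in 2005.
servers_bkwh = 45.0      # servers plus supporting infrastructure, billion kWh
us_total_bkwh = 3661.0   # total U.S. consumption, billion kWh

share_pct = servers_bkwh / us_total_bkwh * 100
print(f"Server share of U.S. electricity: {share_pct:.1f}%")  # ≈ 1.2%
```

The ratio comes out to about 1.23 percent, matching the study's roughly 1.2 percent claim.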

The AMD report is based on comprehensive data collected by International Data Corp. on the existing, historical and projected installed base and shipments of servers from 1996 to 2009.

It underscores what many experts and government agencies have come to realize: Many of the projections of how much power network computing consumes were overstated.

Claudia Chandler, assistant executive director at the California Energy Commission, said some sectors even blamed the energy crisis about six years ago on the growth of the Internet.

"It was exactly that -- an urban myth," she said.

James Bushnell, research director of the UC Energy Institute, said the AMD report offers a more realistic view of the Internet's impact on energy.

"In the end, the bottom line is it's not their fault and we shouldn't be looking toward high-tech as either the cause or the cure of our electricity problem," he said.

In fact, AMD's study is based on concerns over the bottom line.

John Fruehe, a development manager at AMD, said the chip company commissioned the study because its customers are increasingly worried about the power consumed at data centers.

"They are starting to max out their power budgets," he said. "Everything that's involved in this is looking at the bottom line."

Koomey said the growing demand for power at data centers, driven by the rapid expansion of the Web, has prompted government and businesses to develop more energy-efficient networks.

Chandrakant Patel, a research scientist at Hewlett-Packard Labs, said tech companies need to focus on every aspect of the data center, from the microprocessor to the air-conditioning system.

"You really must look at the stack," he said. "It has to be addressed holistically."

Of course, this power usage estimate doesn't include the millions upon millions of PCs around the globe that stay on 24/7 downloading movies, music, and porn...

1 comment:

Cyrus said...

I should really do some simple calculations to figure out how much it costs me to keep my laptop running 24/7. Got to keep that upload/download ratio high!
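The calculation the commenter has in mind is simple enough to sketch. The wattage and electricity price below are illustrative assumptions, not figures from the article or the comment:

```python
# Rough cost of running a laptop 24/7, under assumed values.
watts = 30               # assumed average laptop power draw (illustrative)
price_per_kwh = 0.12     # assumed electricity price in dollars (illustrative)

hours_per_year = 24 * 365
kwh_per_year = watts / 1000 * hours_per_year   # convert watts to kilowatts
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year, about ${cost_per_year:.2f}/year")
```

Under those assumptions the laptop uses about 263 kWh and costs roughly $32 a year; an actual draw would vary with the machine and its load.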