The Guardian, London
If you want to start a rumor, how about that Google is going to build its own nuclear power station? The logic is easy. Larry Page, the company's co-founder, reportedly sees "running out of power" as the biggest potential threat to Google, and the electricity needed to run its "server farms" -- tens of thousands of power-hungry computers storing billions of Internet pages -- could soon cost more than the hardware. Partly this is because Google is based in California, where the state solved its 2001 energy crisis by borrowing US$10 billion to buy electricity at massively inflated prices. But the rest of us are heading in the same direction.
We live in a world where the use of chip-based computers and consumer electronics devices is increasing rapidly, while supplies of oil and natural gas are diminishing perhaps even more rapidly. Worse, the threat of global warming means we should now be decreasing our energy use, as the Japanese are doing, not increasing it. And although each individual PC or peripheral may not use much electricity, when you have a billion of them, it adds up.
Sadly, it's impossible to say how much power a PC uses without measuring it, because of variables such as the type of motherboard, the speed of the chip and the power of the graphics card. (A fast graphics card can use more power than the processor.) PC power supplies are rated from about 150W to about 650W, but that rating is a maximum: actual draw is usually well below it, approaching the rating only under peak loads. PCs also use much less power when idling, and the US Energy Star program -- which PC manufacturers have been following since 1992 -- is aiming to get power consumption at idle below 50-60W.
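The arithmetic behind those wattage figures is easy to sketch. The calculation below uses the idle target from the Energy Star program mentioned above, plus an assumed electricity price of US$0.10 per kWh and an assumed 300W load figure -- both hypothetical numbers chosen for illustration, not figures from the article:

```python
# Back-of-envelope annual electricity cost for a PC left running
# around the clock. Price per kWh is an assumed figure for
# illustration; real tariffs vary widely by region.

HOURS_PER_YEAR = 24 * 365   # machine left on continuously
PRICE_PER_KWH = 0.10        # assumed price in US dollars

def annual_cost(watts: float) -> float:
    """Yearly electricity cost of a constant load of `watts` watts."""
    kwh = watts * HOURS_PER_YEAR / 1000   # watt-hours -> kilowatt-hours
    return kwh * PRICE_PER_KWH

print(f"Idle at 60W:    ${annual_cost(60):.2f} per year")    # $52.56
print(f"Loaded at 300W: ${annual_cost(300):.2f} per year")   # $262.80
```

Multiply the difference by a billion machines and the stakes of "performance per watt" become clear.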
The simplest approach is to use the PC's power-saving software to turn off the screen and hard drive, and then suspend the whole system after a specified idle period.
The situation is improving thanks to market trends towards flat screens and the use of portables rather than desktop computers. LCDs use much less power than traditional monitors, and by design, most notebooks use less power than most desktops. At the extremes, the 1GHz Pentium M Ultra Low Voltage chip uses only 5W whereas Intel's hottest chip for gaming, the 3.73GHz Pentium 4 Extreme, can consume up to 115W.
However, Intel has done a U-turn on its processor design goals, which should help. The Pentium design drove up clock speeds (and power consumption) to build the fastest chips. In 2002, Intel executives still assured me that "gigahertz is far from over" and looked forward to a 4.7GHz Pentium codenamed Tejas. Last year, however, they announced a new mantra: "performance per Watt."
Alistair Kemp, a spokesman for Intel, says the company has now developed "a new microarchitecture that will be coming out in the second half of this year." New chips will, he says, "reduce average power use quite substantially." The reduction will be from about 95W, for a fast Pentium 4, to about 65W.
Performance per Watt is also important for the arrays of servers in corporate data centers. Luiz Andre Barroso, principal engineer at Google, has already warned that "the possibility of computer equipment power consumption spiraling out of control could have serious consequences for the affordability of computing, not to mention the overall health of the planet."