The artificial intelligence industry is scrambling to reduce its massive energy consumption through better cooling systems, more efficient computer chips and smarter programming — all while AI usage explodes worldwide.
AI depends entirely on data centers, which could consume three percent of the world’s electricity by 2030, according to the International Energy Agency. That’s double what they use today.
Experts at McKinsey, a US consulting firm, describe a race to build enough data centers to keep up with AI’s rapid growth, while warning that the world is heading toward an electricity shortage.
“There are several ways of solving the problem,” explained Mosharaf Chowdhury, a University of Michigan professor of computer science.
Companies can either build more energy supply, which takes time and which the AI giants are already scouring the globe to do, or figure out how to consume less energy for the same computing power.
Chowdhury believes the challenge can be met with “clever” solutions at every level, from the physical hardware to the AI software itself.
For example, his lab has developed algorithms that calculate exactly how much electricity each AI chip needs, reducing energy use by 20-30 percent.
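The idea behind such algorithms can be illustrated with a minimal sketch. This is not the lab's actual code; the function name, the measured profile and the 5 percent tolerance are all hypothetical. It simply shows the principle: profile training throughput at several GPU power limits, then pick the lowest limit that keeps performance close to the maximum.

```python
# Hypothetical illustration of power-cap selection for an AI accelerator.
# Given measured throughput at several power limits, choose the lowest
# limit whose throughput stays within a tolerance of the best observed.

def pick_power_cap(measurements, tolerance=0.05):
    """measurements: dict mapping power cap (watts) -> relative throughput.
    Returns the smallest cap whose throughput is within `tolerance`
    of the maximum observed throughput."""
    best = max(measurements.values())
    acceptable = [cap for cap, tput in measurements.items()
                  if tput >= (1 - tolerance) * best]
    return min(acceptable)

# Example: a GPU that loses little speed below its 400 W default cap.
profile = {250: 92.0, 300: 97.0, 350: 99.5, 400: 100.0}
print(pick_power_cap(profile))  # 300: roughly 25% less power for ~3% slowdown
```

In practice the cap would be applied through the vendor's power-management interface, and the trade-off between energy saved and training time is tuned per workload.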
‘CLEVER SOLUTIONS’
Twenty years ago, operating a data center — encompassing cooling systems and other infrastructure — required as much energy as running the servers themselves. Today, operations use just 10 percent of what the servers consume, says Gareth Williams of consulting firm Arup, largely thanks to this focus on energy efficiency.
Many data centers now use AI-powered sensors to control temperature in specific zones rather than cooling entire buildings uniformly.
This allows them to optimize water and electricity use in real-time, according to McKinsey’s Pankaj Sachdeva.
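A stripped-down sketch of the zone-level approach, with invented sensor names and a placeholder setpoint: instead of cooling the whole hall uniformly, a controller reads per-zone temperatures and activates cooling only where a threshold is exceeded. Real controllers also weigh humidity, airflow and predicted load.

```python
# Hypothetical sketch of zone-level cooling control: only zones whose
# sensor reading exceeds the setpoint receive active cooling, rather
# than chilling the entire building uniformly.

def zones_to_cool(readings, setpoint_c=27.0):
    """readings: dict mapping zone name -> temperature in Celsius.
    Returns the zones that currently need active cooling."""
    return sorted(zone for zone, temp in readings.items()
                  if temp > setpoint_c)

sensors = {"rack-A": 24.5, "rack-B": 29.1, "rack-C": 26.8, "rack-D": 31.0}
print(zones_to_cool(sensors))  # ['rack-B', 'rack-D']
```

The water and electricity savings come from leaving the zones below the setpoint alone, a decision that AI-powered sensors can make continuously in real time.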
For many, the game-changer will be liquid cooling, which replaces the roar of energy-hungry air conditioners with a coolant that circulates directly through the servers.
“All the big players are looking at it,” Williams said.
This matters because modern AI chips from companies like Nvidia consume 100 times more power than servers did two decades ago.
Amazon’s world-leading cloud computing business, AWS, last week said it had developed its own liquid-cooling method for the Nvidia GPUs in its servers — avoiding the need to rebuild existing data centers.
“There simply wouldn’t be enough liquid-cooling capacity to support our scale,” Dave Brown, vice president of compute and machine learning services at AWS, said in a YouTube video.
US VS CHINA
For McKinsey’s Sachdeva, a reassuring factor is that each new generation of computer chips is more energy-efficient than the last.
Research by Purdue University’s Yi Ding has shown that AI chips can last longer without losing performance.
“But it’s hard to convince semiconductor companies to make less money” by encouraging customers to keep using the same equipment longer, Ding said.
Yet even if more efficient chips and lower per-chip energy use are likely to make AI cheaper, they won’t reduce total energy consumption.
“Energy consumption will keep rising,” Ding predicted, despite all efforts to limit it. “But maybe not as quickly.”
In the US, energy is now seen as key to keeping the country’s competitive edge over China in AI.
In January, Chinese startup DeepSeek unveiled an AI model that performed as well as top US systems despite using less powerful chips — and by extension, less energy.
DeepSeek’s engineers achieved this by programming their GPUs more precisely and skipping an energy-intensive training step that was previously considered essential.
China is also feared to be far ahead of the US in available energy sources, including renewables and nuclear power.