The artificial intelligence (AI) hype that made Nvidia Corp the world’s biggest company has come with a price for the climate. Data centers housing its powerful chips are gorging on power and belching carbon dioxide, and sobering figures now reveal the extent of the problem.
Data centers are projected to use 8 percent of US power by 2030, up from 3 percent in 2022, as their energy demand grows by 160 percent, a recent report from Goldman Sachs Group Inc said.
For now, AI is doing more to worsen the climate emergency than to solve it, despite what some AI firms have touted. So great are the energy needs that utilities are extending their plans for coal plants, while Microsoft Corp is building gas and nuclear facilities to keep its servers humming.
Illustration: Tania Chou
Add all this to the growing discontent about generative AI tools. To not only stem the tide, but also uphold their goals of building AI “for humanity,” tech firms like OpenAI, Microsoft and Alphabet Inc’s Google must grow the teams addressing the power issue. That is certainly possible: a few signs of progress suggest the trick might be to redesign their algorithms.
Generative AI models like ChatGPT and Anthropic’s Claude are impressive, but their neural network architectures demand vast amounts of energy, and their indecipherable “black box” decision-making processes make them difficult to optimize. The current state of AI is like trying to power a small car with a huge gas-guzzling engine: It gets the job done, but at an enormous cost.
The good news is that these “engines” could get smaller with greater investment. Researchers at Microsoft, for instance, have developed a so-called “1-bit” architecture that can make large language models about 10 times more energy efficient than the current leading systems. This approach simplifies the models’ calculations by reducing values to 0 or 1, slashing power consumption without sacrificing too much performance. The resulting tech is not the most capable, but it is a good example of a “contrarian” approach that can immediately reduce AI’s cost and environmental impact, says Steven Marsh, founder of UK-based start-up Zetlin Ltd, which is working on building more efficient systems.
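As a rough sketch of what such extreme quantization looks like in practice, the Python snippet below binarizes a toy weight matrix and applies it to an input. It is illustrative only, not Microsoft’s actual implementation; schemes differ in detail (this one maps weights to -1 or +1 with a single scale factor), but the common thread is that arithmetic on one-bit weights is far cheaper than full-precision multiplication.

```python
import numpy as np

def quantize_weights_1bit(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Binarize a weight matrix to -1/+1, keeping a single scale factor.

    An illustrative sketch of the idea behind "1-bit" models, not
    Microsoft's actual implementation.
    """
    scale = float(np.abs(weights).mean())    # one scalar preserves overall magnitude
    binary = np.where(weights >= 0, 1, -1)   # each weight now needs only one bit
    return binary.astype(np.int8), scale

def linear_1bit(x: np.ndarray, binary: np.ndarray, scale: float) -> np.ndarray:
    # With weights of -1/+1, the matrix product reduces to additions and
    # subtractions, which cost far less energy than full multiply-accumulates.
    return scale * (x @ binary.T.astype(np.float32))

# Toy usage: a 4-dimensional input through a 3x4 layer
rng = np.random.default_rng(0)
w = rng.normal(size=(3, 4)).astype(np.float32)
x = rng.normal(size=(1, 4)).astype(np.float32)
b, s = quantize_weights_1bit(w)
print(linear_1bit(x, b, s))   # rough approximation of x @ w.T
```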
Marsh says he is making progress. His team recently trained a neural network-based AI model on an Nvidia graphics processing unit, and over five days the system ran so hot that they had to bring fans into the room. When they ran the same model on their proprietary, non-neural network technology, it used just 60 percent of the power. The current approach, Marsh says, is “like putting a rocket engine on a bicycle.”
Nvidia has also taken promising steps toward addressing the energy problem. A couple of years ago, it developed a new format for its chips to process AI calculations with smaller numbers, making them faster and less power-hungry.
“Just that little tweak on the silicon saved a lot of energy,” Marsh says.
If companies designing AI systems take better advantage of that tweak, they could eventually see meaningful energy savings.
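As a rough illustration of why smaller number formats matter, the sketch below compares the memory footprint of one weight matrix stored at 32-bit and at 16-bit precision (numpy offers no 8-bit float, so 16-bit stands in here for the general idea behind reduced-precision formats on AI chips). Moving fewer bytes between memory and the processor is where much of the energy saving comes from.

```python
import numpy as np

# Illustrative only: numpy has no 8-bit float type, so float16 stands in for
# the general idea of a reduced-precision format on an AI accelerator.
rng = np.random.default_rng(1)
weights_fp32 = rng.normal(size=(1024, 1024)).astype(np.float32)
weights_fp16 = weights_fp32.astype(np.float16)

print(weights_fp32.nbytes / 1e6, "MB at 32-bit")   # ~4.2 MB
print(weights_fp16.nbytes / 1e6, "MB at 16-bit")   # ~2.1 MB, half the data to move

# The trade-off is rounding error, which is why precision is cut only where
# models can tolerate it.
max_error = np.abs(weights_fp32 - weights_fp16.astype(np.float32)).max()
print("worst-case rounding error:", max_error)
```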
It does not help that AI companies are in an arms race. OpenAI and Anthropic have raised US$11.3 billion and US$8.4 billion respectively, data provider PitchBook said. Much of that money is not going to recruitment (each has a workforce of just a few hundred people). Instead, it is being poured into running the servers that train and run their models, even as that investment yields diminishing returns. (There is evidence that the latest text and vision-reading systems are showing smaller advancements in areas like accuracy and capability.)
Those companies, along with Google, Microsoft and Amazon.com Inc, should devote additional money to refashioning their algorithms to save energy and cost. It has been done before: data centers managed to keep their power demands flat between 2015 and 2019, even as their workloads tripled, because their operators found ways to make them more efficient, Goldman Sachs said.
OpenAI chief executive officer Sam Altman has talked up nuclear fusion as an answer to the problem, having personally invested US$375 million in an enterprise called Helion Energy. However, he might be creating hype around an energy technology that may not be commercialized for decades.
Rather than outsource responsibility to a futuristic energy source or superintelligent AI that does not exist yet, tech firms should put greater focus on making their models more energy efficient now. After all, breaking away from established and inefficient systems was how this revolution began in the first place.
Parmy Olson is a Bloomberg Opinion columnist covering technology. She is a former reporter for the Wall Street Journal and Forbes. This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.