While the rise of artificial intelligence (AI) could revolutionize numerous sectors and unlock unprecedented economic opportunities, its energy intensity has raised serious environmental concerns.
In response, tech companies promote frugal AI practices and support research focused on reducing energy consumption, but this approach falls short of addressing the root causes of the industry’s growing demand for energy.
Developing, training and deploying large language models (LLMs) is an energy-intensive process that requires vast amounts of computational power. With the widespread adoption of AI driving a surge in data centers’ electricity consumption, the International Energy Agency projects that AI-related energy demand will double by next year.
Data centers already account for 1 to 2 percent of global energy consumption — about the same as the entire airline industry. In Ireland, data centers accounted for 21 percent of total electricity consumption in 2023. As industries and citizens shift toward electrification to reduce greenhouse gas emissions, rising AI demand places enormous strain on power grids and the energy market.
Unsurprisingly, Ireland’s grid operator, EirGrid, has imposed a moratorium on new data center developments in Dublin until 2028. Countries such as Germany, Singapore and China have also imposed restrictions on new data center projects.
To mitigate the environmental impact of emerging technologies, the tech industry has begun to promote the concept of frugal AI, which involves raising awareness of AI’s carbon footprint and encouraging end users — academics and businesses — to select the most energy-efficient model for any given task.
However, while efforts to promote more conscious AI use are valuable, focusing solely on users’ behavior overlooks a critical fact: suppliers are the primary drivers of AI’s energy consumption.
Currently, factors like model architecture, data center efficiency and electricity-related emissions have the greatest impact on AI’s carbon footprint.
In addition, as the technology evolves, individual users will have even less influence over its sustainability, especially as AI models become increasingly embedded within larger applications, making it harder for end users to discern which actions trigger resource-intensive processes.
These challenges are compounded by the rise of agentic AI — independent systems that collaborate to solve complex problems. While experts see this as the next big thing in AI development, such interactions require even more computational power than today’s most advanced LLMs, potentially exacerbating the technology’s environmental impact.
Moreover, shifting the responsibility for reducing AI’s carbon footprint onto users is counterproductive, given the industry’s lack of transparency. Most cloud providers do not yet disclose emissions data specifically related to generative AI, making it difficult for customers to assess the environmental impact of their AI use.
A more effective approach would be for AI providers to supply consumers with detailed emissions data. Increased transparency would empower users to make informed decisions, while encouraging suppliers to develop more energy-efficient technologies.
With access to emissions data, consumers could compare AI applications and select the most energy-efficient model for a specific task. Businesses could also more easily choose a traditional information technology solution over an energy-intensive generative AI system if the overall impact is clear from the beginning. By working together, AI companies and consumers could balance AI’s potential benefits with its environmental costs.
To be sure, frugal AI might lead to some efficiency gains, but it does not address the core problem of AI’s insatiable energy demand. By providing greater transparency about energy consumption, sharing comprehensive emissions data and developing standardized metrics for AI models, companies could help clients optimize their carbon budgets and adopt more sustainable practices.
The automotive industry offers a useful model for increasing energy transparency in AI development. By labeling the energy efficiency of their vehicles, auto manufacturers allow buyers to make more sustainable choices. Generative AI providers could adopt a similar approach and establish standardized metrics to capture the environmental impact of their models.
One such metric could be electricity consumption per token, which quantifies the amount of energy required for an AI model to process a single unit of text.
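As a rough illustration of how such a metric might be derived, the sketch below computes a hypothetical energy-per-token figure from a measured electricity draw and a token count. The function name and all numbers are placeholders chosen to show the arithmetic, not figures reported by any provider.

```python
# Minimal sketch (not from the article): turning measured electricity use
# into an "energy per token" figure. All values are hypothetical.

def energy_per_token(total_energy_wh: float, tokens_processed: int) -> float:
    """Return watt-hours of electricity consumed per token processed."""
    if tokens_processed <= 0:
        raise ValueError("tokens_processed must be positive")
    return total_energy_wh / tokens_processed

# Example: a workload that drew 1,200 Wh while processing 3 million tokens
# (illustrative numbers only).
rate = energy_per_token(total_energy_wh=1200.0, tokens_processed=3_000_000)
print(f"{rate * 1000:.4f} mWh per token")  # prints "0.4000 mWh per token"
```

Reported consistently across models, a figure of this kind would let buyers compare systems much as fuel-economy labels let drivers compare vehicles.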
Just as fuel-efficiency standards allow car buyers to compare different models and hold manufacturers accountable, businesses and individual users need reliable tools to evaluate the environmental impact of AI models before deploying them.
By introducing transparent metrics, technology companies could not only steer the industry toward more sustainable innovation, but also ensure that AI helps combat climate change instead of contributing to it.
Boris Ruf is Research Scientist Lead at AXA.
Copyright: Project Syndicate