While the rise of artificial intelligence (AI) could revolutionize numerous sectors and unlock unprecedented economic opportunities, its energy intensity has raised serious environmental concerns.
In response, tech companies promote frugal AI practices and support research focused on reducing energy consumption, but this approach falls short of addressing the root causes of the industry’s growing demand for energy.
Developing, training and deploying large language models (LLMs) is an energy-intensive process that requires vast amounts of computational power. With the widespread adoption of AI leading to a surge in data centers’ electricity consumption, the International Energy Agency projects that AI-related energy demand will double by next year.
Data centers already account for 1 to 2 percent of global energy consumption — about the same as the entire airline industry. In Ireland, data centers accounted for 21 percent of total electricity consumption in 2023. As industries and citizens shift toward electrification to reduce greenhouse gas emissions, rising AI demand places enormous strain on power grids and the energy market.
Unsurprisingly, Ireland’s grid operator, EirGrid, has imposed a moratorium on new data center developments in Dublin until 2028. Countries such as Germany, Singapore and China have also imposed restrictions on new data center projects.
To mitigate the environmental impact of emerging technologies, the tech industry has begun to promote the concept of frugal AI, which involves raising awareness of AI’s carbon footprint and encouraging end users — academics and businesses — to select the most energy-efficient model for any given task.
However, while efforts to promote more conscious AI use are valuable, focusing solely on users’ behavior overlooks a critical fact: suppliers are the primary drivers of AI’s energy consumption.
Currently, factors like model architecture, data center efficiency and electricity-related emissions have the greatest impact on AI’s carbon footprint.
In addition, as the technology evolves, individual users will have even less influence over its sustainability, especially as AI models become increasingly embedded within larger applications, making it harder for end users to discern which actions trigger resource-intensive processes.
These challenges are compounded by the rise of agentic AI — independent systems that collaborate to solve complex problems. While experts see this as the next big thing in AI development, such interactions require even more computational power than today’s most advanced LLMs, potentially exacerbating the technology’s environmental impact.
Moreover, shifting the responsibility for reducing AI’s carbon footprint onto users is counterproductive, given the industry’s lack of transparency. Most cloud providers do not yet disclose emissions data specifically related to generative AI, making it difficult for customers to assess the environmental impact of their AI use.
A more effective approach would be for AI providers to supply consumers with detailed emissions data. Increased transparency would empower users to make informed decisions, while encouraging suppliers to develop more energy-efficient technologies.
With access to emissions data, consumers could compare AI applications and select the most energy-efficient model for a specific task. Businesses could also more easily choose a traditional information technology solution over an energy-intensive generative AI system if the overall impact is clear from the beginning. By working together, AI companies and consumers could balance AI’s potential benefits with its environmental costs.
To be sure, frugal AI might lead to some efficiency gains, but it does not address the core problem of AI’s insatiable energy demand. By providing greater transparency about energy consumption, sharing comprehensive emissions data and developing standardized metrics for AI models, companies could help clients optimize their carbon budgets and adopt more sustainable practices.
The automotive industry offers a useful model for increasing energy transparency in AI development. By labeling the energy efficiency of their vehicles, auto manufacturers allow buyers to make more sustainable choices. Generative AI providers could adopt a similar approach and establish standardized metrics to capture the environmental impact of their models.
One such metric could be electricity consumption per token, which quantifies the amount of energy required for an AI model to process a single unit of text.
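To make the idea concrete, the arithmetic behind such a label would be simple. The following sketch is purely illustrative; the function name and the energy and token figures are hypothetical, not drawn from any real model or provider.

```python
# Illustrative sketch: computing a hypothetical "electricity per token" metric.
# All figures are invented for demonstration purposes only.

def energy_per_token(total_energy_wh: float, tokens_processed: int) -> float:
    """Return the watt-hours of electricity consumed per token for a measured workload."""
    return total_energy_wh / tokens_processed

# Hypothetical measurement: a batch of requests consumed 1.2 Wh of electricity
# while processing 4,000 tokens.
metric = energy_per_token(total_energy_wh=1.2, tokens_processed=4_000)
print(f"{metric:.6f} Wh per token")  # prints 0.000300 Wh per token
```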
Just as fuel-efficiency standards allow car buyers to compare different models and hold manufacturers accountable, businesses and individual users need reliable tools to evaluate the environmental impact of AI models before deploying them.
By introducing transparent metrics, technology companies could not only steer the industry toward more sustainable innovation, but also ensure that AI helps combat climate change instead of contributing to it.
Boris Ruf is Research Scientist Lead at AXA.
Copyright: Project Syndicate