
AI and Its Environmental Consequences: Can We Turn the Tide on Carbon Emissions?
Dr Yifei Zhang
11 December 2024
With artificial intelligence (AI) technology advancing by leaps and bounds, AI applications have permeated almost every aspect of human life, from smart assistants and autonomous driving to industrial production and medical diagnosis. According to the International Data Corporation, the global AI market is expected to grow from US$132.4 billion in 2022 to US$512.4 billion in 2027.
While the convenience of AI innovation is applauded across society, far fewer people realize that this technological revolution is quietly exerting a tremendous impact on the global environment. The carbon emissions arising from AI development have in fact reached a level that can no longer be ignored.
The invisible killer: the carbon footprint of AI training
To understand AI's impact on the environment, it is necessary to look behind the scenes of AI model training. Training modern AI models, particularly large language models, requires massive amounts of data and computational resources. Research by the University of Massachusetts Amherst indicates that training a single large AI model can emit 626,000 pounds of carbon dioxide, equivalent to the total lifetime emissions of five cars, from production to disposal.
Specifically, approximately 552 tonnes of carbon dioxide were emitted during the training of GPT-3, while the CO2 emitted in training the even larger GPT-4 is estimated to exceed 1,000 tonnes. Of particular concern is that these figures continue to climb. Under the sectoral consensus that “large models are the order of the day”, technology giants have been vying to build ever-larger models, resulting in an exponential surge in energy consumption. The AI sector’s carbon emissions are forecast to account for 3.5% of the world’s total by 2030.
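As a quick sanity check on the units in these figures, the pound-to-kilogram conversion below is standard; every other number is taken from the text above:

```python
# Rough unit check of the emissions figures cited above.
LB_TO_KG = 0.45359237  # standard conversion factor

umass_study_lb = 626_000            # UMass Amherst figure, pounds of CO2
umass_study_tonnes = umass_study_lb * LB_TO_KG / 1000

gpt3_tonnes = 552                   # training emissions cited for GPT-3
gpt4_tonnes = 1000                  # lower bound cited for GPT-4

print(f"626,000 lb = {umass_study_tonnes:.0f} tonnes of CO2")
print(f"GPT-4 / GPT-3 ratio (lower bound): {gpt4_tonnes / gpt3_tonnes:.1f}x")
```

In other words, the model in the UMass study corresponds to roughly 284 tonnes of CO2, so the GPT-3 and GPT-4 figures sit on the same scale or above it.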
Data centres: an energy-guzzling beast in the AI era
The energy consumption of large AI models has now reached an alarming level. Data from the Stanford AI Laboratory show that a single training run of GPT-3 typically uses 1,287 megawatt-hours of electricity, comparable to the power consumed by 3,000 Tesla electric cars each travelling 200,000 miles, and emits a total of 552 tonnes of carbon dioxide.
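A useful cross-check is that the two GPT-3 figures quoted here are mutually consistent: dividing the emissions by the electricity gives an implied carbon intensity of about 0.43 kg of CO2 per kilowatt-hour, roughly what an average fossil-heavy grid mix emits. A minimal sketch:

```python
# Cross-checking the two GPT-3 figures quoted above: 1,287 MWh of
# electricity and 552 tonnes of CO2 imply a grid carbon intensity of
# about 0.43 kg CO2 per kWh.
energy_mwh = 1287
co2_tonnes = 552

kg_per_kwh = (co2_tonnes * 1000) / (energy_mwh * 1000)  # kg CO2 per kWh
print(f"Implied carbon intensity: {kg_per_kwh:.2f} kg CO2/kWh")
```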
In daily use, every response generated by ChatGPT requires about 2.96 watt-hours of electricity, almost 10 times that of a standard Google search (0.3 watt-hours); each AI-powered Google search uses as much as 8.9 watt-hours. Water consumption is also alarming. Training GPT-3 consumed close to 700 tonnes of water, and every 20 to 50 questions answered consume a further 500 millilitres. For cooling its data centres alone, Meta used over 2.6 million cubic metres of water in 2022.
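The per-interaction figures above can be restated in aggregate terms. In the sketch below, the energy and water numbers come from the article, while the daily query volume is a purely illustrative assumption, not a cited figure:

```python
# Restating the per-interaction figures cited above.
chatgpt_wh = 2.96              # Wh per ChatGPT response (cited)
google_wh = 0.3                # Wh per standard Google search (cited)
queries_per_day = 10_000_000   # illustrative assumption only

ratio = chatgpt_wh / google_wh
daily_mwh = chatgpt_wh * queries_per_day / 1e6   # Wh -> MWh

# 500 ml of water per 20-50 questions (cited)
ml_low, ml_high = 500 / 50, 500 / 20

print(f"ChatGPT uses {ratio:.1f}x the energy of a standard search")
print(f"{queries_per_day:,} responses/day ≈ {daily_mwh:.1f} MWh/day")
print(f"Water per question: {ml_low:.0f}-{ml_high:.0f} ml")
```

At the assumed volume of ten million responses a day, the per-response figure alone adds up to roughly 29.6 MWh daily, and 10 to 25 millilitres of water per question.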
Root causes of escalating energy consumption
The colossal energy consumption of large AI models can be attributed mainly to two core factors. First, rapid iterations of AI technology have sharply stimulated the demand for chips, directly pushing up electricity consumption. The training and inference of modern AI models demand enormous computational resources, supplied primarily by high-performance hardware such as graphics processing units and application-specific integrated circuits. Such hardware is highly energy-intensive when running complex computations. As AI models keep growing in size, their computational requirements grow exponentially, driving an ever-increasing demand for high-performance chips and, consequently, mounting energy consumption.
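The link between model size and energy use can be sketched with the common rule of thumb that training takes roughly 6 FLOPs per parameter per token. The parameter and token counts below are GPT-3's published figures; the accelerator throughput, utilization, and power draw are illustrative assumptions, not measured values:

```python
# Back-of-envelope sketch of training energy via the 6*N*D FLOPs rule.
params = 175e9        # GPT-3 parameters (published)
tokens = 300e9        # GPT-3 training tokens (published)
total_flops = 6 * params * tokens        # ~3.15e23 FLOPs

peak_flops = 125e12   # assumed per-accelerator peak (FP16), FLOP/s
utilization = 0.30    # assumed sustained fraction of peak
power_w = 300         # assumed per-accelerator power draw, watts

accel_seconds = total_flops / (peak_flops * utilization)
energy_mwh = accel_seconds * power_w / 3.6e9   # joules -> MWh

print(f"Total compute: {total_flops:.2e} FLOPs")
print(f"Estimated accelerator energy: {energy_mwh:.0f} MWh")
```

Even this rough sketch lands in the right range: it yields about 700 MWh for the accelerators alone, and multiplying by a facility overhead (PUE) of 1.5 to 2 approaches the 1,287 MWh figure cited earlier.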
Second, substantial computational power is needed to sustain the training process, and the data centres supplying it run around the clock, generating excessive heat that must be removed. Energy consumption is an especially severe issue for data centres, the core infrastructure of AI computation. Servers and storage devices running at high load release vast amounts of heat; if it is not dissipated in time, both the performance and the lifespan of the equipment are seriously compromised. Data centres are therefore equipped with high-capacity cooling systems to keep equipment at optimal operating temperatures.
In a data centre's operating cost structure, electricity accounts for about 60% of the total, of which over 40% is spent on cooling. At air-cooled data centres in particular, more than 60% of electricity goes to cooling while less than 40% goes to computation. Partly as a result of this imbalance, the energy consumption of data centres worldwide is now almost 10 times what it was a decade ago. Traditional air-cooling systems are cheaper but less efficient, and cannot meet the requirements for high-efficiency cooling. Liquid-cooling systems, by contrast, demand a large up-front investment but are far more efficient, sharply reducing energy consumption at data centres.
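The split described above maps onto the industry's standard metric, Power Usage Effectiveness (PUE), defined as total facility energy divided by the energy delivered to IT equipment. The air-cooled split comes from the figures above; the liquid-cooled split is an assumed illustration:

```python
# PUE = total facility energy / energy delivered to IT equipment.
def pue(it_fraction):
    """PUE from the fraction of facility power that reaches IT gear."""
    return 1.0 / it_fraction

air_cooled = pue(0.40)     # cited: <40% of power reaches computation
liquid_cooled = pue(0.80)  # assumption: liquid cooling lets ~80% reach IT

print(f"Air-cooled PUE:    {air_cooled:.2f}")
print(f"Liquid-cooled PUE: {liquid_cooled:.2f}")
```

An air-cooled facility with that split runs at a PUE of 2.5, meaning 1.5 watts of overhead for every watt of computation; the assumed liquid-cooled case cuts the overhead to 0.25 watts.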
In addition, the site selection and design of data centres have a significant impact on energy consumption. Many data centres are located in areas with lower electricity costs but hot climates, placing a heavier burden on the cooling systems. To improve energy utilization, priority should be given to locations with cooler temperatures and a stable energy supply. Moreover, a modular design should be adopted so that resource allocation can be flexibly adjusted to demand.
Finally, the training and inference of AI models also involve huge volumes of data transmission and storage, which inevitably add to energy consumption. As ever more data is created, data centres need additional storage devices and greater bandwidth to cope, expanding energy consumption further. Companies should therefore use data compression and transmission-optimization techniques to cut energy consumption by eliminating unnecessary work.
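As a toy illustration of the compression point, the sketch below assumes a highly repetitive telemetry payload; real savings depend entirely on how compressible the data actually is:

```python
import zlib

# Illustrative only: a highly repetitive telemetry payload, showing how
# much transmission volume compression can remove before data leaves
# the rack.
payload = b"sensor_id=42;temp_c=23.5;status=ok\n" * 1000
compressed = zlib.compress(payload, level=9)

ratio = len(payload) / len(compressed)
saved_pct = 100 * (1 - len(compressed) / len(payload))
print(f"Original: {len(payload):,} bytes, compressed: {len(compressed):,} bytes")
print(f"Compression ratio: {ratio:.0f}:1 ({saved_pct:.1f}% less to transmit)")
```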
Corporate solutions and policy suggestions
In the face of the environmental challenges posed by AI technology, companies and policy-makers need to take a series of carbon-reduction measures. First, businesses should maximize the use of green energy and energy-saving technologies, investing in renewable sources such as solar and wind power to minimize reliance on traditional fossil fuels. Second, enterprises should optimize AI training algorithms to streamline computation, cutting energy consumption at the source. Third, they should strengthen data centre management and upgrade technologies: adopt high-efficiency solutions such as liquid cooling to improve energy utilization, and minimize the waste of idle resources through smart scheduling and load balancing. Fourth, through virtualization, companies can consolidate computational resources to lower energy consumption.
In terms of policy-making, governments should first set strict energy-efficiency standards to promote the green development of AI, and use tax concessions and funding to encourage enterprises to adopt energy-saving technologies and renewable energy. Second, regulation of data centres should be strengthened and energy-efficiency evaluation standards established to raise overall efficiency. Third, governments and industry should join hands to spread environmental awareness among the public and businesses, and the negative impact of AI on the environment can be further reduced through mechanisms such as carbon trading and carbon offsets. Education and publicity are both indispensable: only when the relationship between AI advancement and environmental protection is widely understood can a social consensus be reached and concerted efforts made to address the problem.
Glimmers of hope amid crisis
The advance of AI technology may be unstoppable, but we must ensure that the environment is not harmed as a result. Through technological innovation, corporate self-regulation, government guidance, and social oversight, the environmental impact can be minimized while the convenience of AI is enjoyed by all. The International Renewable Energy Agency predicts that, with proactive measures, the annual growth in the AI industry's carbon emissions can be kept within 5% by 2030.
As witnesses to and participants in this era, each of us should pay attention to the environmental issues brought about by advancements in AI and take concrete action to support its green development. Only then can we ensure that AI technology benefits humankind rather than becoming another burden on the Earth. In the quest for technological breakthroughs, environmental protection should be a bottom line to be upheld, not a token gesture. Let all sectors of the community work together to drive AI towards a greener, more sustainable future.
Through policy guidance, technological innovation, and public engagement, the path will be paved for Hong Kong to achieve the AI industry’s carbon-neutral goals by 2035 and contribute to the sustainable development of the world.