Tech companies are reporting increased emissions due to running the data centres that power artificial intelligence (AI).
But AI tools can also help facilitate the energy transition.
A multistakeholder approach, like The World Economic Forum’s Artificial Intelligence Governance Alliance, is vital to help balance AI’s resource use and benefits.
This article is reproduced from content analysed by the World Economic Forum, published on the World Economic Forum website on 22 July 2024 and recently updated.

How much energy does artificial intelligence (AI) use? Ask ChatGPT and this is what it says:
“AI systems vary widely in energy consumption depending on their complexity and usage, but they generally require significant amounts of electricity to process and analyse data efficiently.”
That response required around 10 times the electricity of a Google search, by some estimates. And with more than 400 million active users of ChatGPT every week, the extra energy demand starts to add up. And that’s just users on one platform. Another platform, China’s DeepSeek, recently created waves by claiming to be cheaper and much more energy efficient than rivals. But that’s since been called into question, leaving the issue of AI’s energy consumption an ongoing concern.
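The scale of that claim can be sketched with some back-of-envelope arithmetic. The figures below are illustrative assumptions, not measured values: a Google search is commonly estimated at around 0.3 Wh, the article cites a roughly 10x multiplier for a ChatGPT response, and the queries-per-user rate is hypothetical.

```python
# Back-of-envelope sketch of ChatGPT's extra energy demand.
# All inputs are illustrative assumptions, not measured values.
GOOGLE_SEARCH_WH = 0.3          # assumed energy per Google search (Wh)
CHATGPT_MULTIPLIER = 10         # ~10x a Google search, as cited above
WEEKLY_USERS = 400_000_000      # active ChatGPT users per week
QUERIES_PER_USER_PER_WEEK = 10  # hypothetical usage rate

chatgpt_query_wh = GOOGLE_SEARCH_WH * CHATGPT_MULTIPLIER  # ~3 Wh per query
total_wh = chatgpt_query_wh * WEEKLY_USERS * QUERIES_PER_USER_PER_WEEK
weekly_mwh = total_wh / 1e6  # 1 MWh = 1,000,000 Wh
print(f"~{chatgpt_query_wh:.0f} Wh per query, ~{weekly_mwh:,.0f} MWh per week")
```

Even under these rough assumptions, one platform's weekly demand lands in the thousands of megawatt hours, which is why the per-query difference "starts to add up".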
The data centres used to train and operate AI models consume much of this energy. A typical AI data centre, according to the International Energy Agency (IEA), currently uses as much power as 100,000 households, but the largest centres now under construction will consume 20 times that amount.
Tech firms have reported that the increasing energy demand from building and running these data centres is pushing up their global greenhouse gas (GHG) emissions. Microsoft, which has invested in ChatGPT maker OpenAI and has positioned generative AI tools at the heart of its product offering, announced in May 2024 that its CO2 emissions had risen nearly 30% since 2020 due to data centre expansion. Google’s 2023 GHG emissions were almost 50% higher than in 2019, largely due to the energy demand tied to data centres.
So, while AI tools promise to help the energy transition, they also require significant emissions-intensive computing power – and that need is only likely to grow.
What’s driving AI’s energy demand?
AI’s energy use currently represents only a fraction of the technology sector’s power consumption, which is estimated to generate around 2-3% of total global emissions. This is likely to change as more companies, governments and organizations use AI to drive efficiency and productivity. Data centres are already significant drivers of electricity demand growth in many regions.

Image: IEA
As these systems gain traction and develop further, training and running the models will drive an exponential increase in the number of data centres needed globally – and in the associated energy use. The electricity used by data centres is expected to more than double by 2030, surpassing Japan’s current total consumption, according to the IEA. It says demand for other digital services will account for some of this, but AI is “the most important driver”.
This will put increasing pressure on already-strained electrical grids.

Image: Artificial Intelligence’s Energy Paradox: Balancing Challenges and Opportunities, the AI Governance Alliance, WEF, Accenture
Training generative AI, in particular, is extremely energy intensive and consumes much more electricity than traditional data-centre activities. As one AI researcher explained: “When you deploy AI models, you have to have them always on. ChatGPT is never off.”
The growth in sophistication of a large language model, such as the one on which ChatGPT is built, illustrates this escalating demand for energy.
Training a model such as OpenAI’s Generative Pre-trained Transformer 3 (GPT-3) is estimated to use just under 1,300 megawatt hours (MWh) of electricity. This is roughly equivalent to the annual power consumption of 130 homes in the US. Training the more advanced GPT-4, meanwhile, is estimated to take 50 times more electricity.
Overall, the computational power needed for sustaining AI’s growth is doubling roughly every 100 days.
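The two figures above can be sanity-checked with simple arithmetic. The household comparison assumes an average US home uses roughly 10 MWh of electricity per year (an assumption implied by the 130-home equivalence, not stated in the source), and the "doubling every 100 days" claim compounds into a striking annual growth factor:

```python
# Illustrative arithmetic behind the figures above (inputs are assumptions).
US_HOME_MWH_PER_YEAR = 10.0  # assumed average annual US household electricity use
GPT3_TRAINING_MWH = 1_300    # estimated energy to train GPT-3, as cited above

homes_equivalent = GPT3_TRAINING_MWH / US_HOME_MWH_PER_YEAR
print(f"GPT-3 training ~= {homes_equivalent:.0f} US homes' annual electricity")

# Doubling every 100 days implies this compound growth over one year:
doublings_per_year = 365 / 100
yearly_growth = 2 ** doublings_per_year
print(f"~{yearly_growth:.1f}x more compute needed after one year")
```

A doubling every 100 days compounds to more than a twelvefold increase in compute demand per year, which is why efficiency gains alone struggle to keep pace.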
How can the AI industry improve its energy efficiency?
This leaves society wrestling with some thorny questions. Do the economic and societal benefits of AI outweigh the environmental cost of using it? And more specifically, do the benefits of AI for the energy transition outweigh its increased energy consumption?
Recent IEA research states that data centres are among the fastest-growing sources of emissions globally, but also that these emissions will remain below 1.5% of the total for the energy sector between now and 2035. In fact, the IEA says: “The widespread adoption of existing AI applications could lead to emissions reductions that are far larger than emissions from data centres – but also far smaller than what is needed to address climate change.”
Other reports estimate that AI could help mitigate 5-10% of global GHG emissions by 2030. So, while the IEA believes AI could be “a tool in reducing emissions”, it’s not “a silver bullet” that can replace proactive policy on decarbonization.

Image: OpenAI
Finding the sweet spot between the challenges and opportunities of AI will be key to getting the answers we need. What needs to happen to strike the right balance?
Regulators including the European Parliament are beginning to establish requirements for systems to be designed with the capability of logging their energy consumption. And improvements in technology could help address AI’s energy demand, with more advanced hardware and processing power expected to improve the efficiency of AI workloads.
Researchers are designing specialized hardware such as accelerators, new technologies such as 3D chips that offer much-improved performance, and new chip-cooling techniques. Computer chip maker Nvidia claims its new ‘superchip’ can deliver a 30 times performance improvement when running generative AI services, while using 25 times less energy.
Data centres, too, are becoming more efficient. And new cooling technologies and sites that can perform more computations when power is cheaper, more available and more sustainable are being explored to push this efficiency further.
Alongside this, reducing overall data usage – including addressing the issue of dark data, which is data generated and stored but then never used again – will be important. Being more selective about how and where AI is used will also help, for example using small language models, which are less resource intensive, for specific tasks. Finding a better balance between performance, costs and the carbon footprint of AI workloads will be key.
What about AI’s impact on the electrical grid?
AI is not the only factor applying pressure to the grid. The energy needs of growing populations and trends towards electrification are creating increased demand that could lead to slower decarbonization of the grid.
Yet a clean, modern and decarbonized grid will be vital in the broader move to a net-zero emissions economy. Although coal is currently the largest source of electricity supplying data centres, according to the IEA, the good news is that renewable energy will be the fastest-growing electricity source over the next five years. It’s set to meet nearly 50% of the growth in data centre power demand.
Data centre operators are also exploring alternative power options, such as nuclear technologies and storage sources like hydrogen, to power sites. Companies are also investing in emerging tech such as carbon removal to suck CO2 out of the air and store it safely.
AI can have a role in overcoming barriers to integrating the necessary vast amounts of renewable energy into existing grids, too. The variability of renewable energy often results in overproduction and underproduction depending on the weather. This leads to wasteful energy consumption and grid instability.
But by analyzing vast datasets – from weather patterns to energy consumption trends – AI can forecast energy production with remarkable accuracy. This could enable job scheduling and load shifting to make sure data centres use energy when electricity from renewable energy sources is available, ensuring optimal grid stability, efficiency and 24/7 clean power.
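The load-shifting idea described above can be sketched in a few lines. The hourly renewable-share forecast below is entirely hypothetical, and a real scheduler would weigh price, latency and capacity constraints; this is only a minimal illustration of placing deferrable compute into the greenest forecast hours:

```python
# Minimal load-shifting sketch: given a (hypothetical) hourly forecast of
# the grid's renewable share, schedule flexible data-centre jobs into the
# hours with the most renewable power.
renewable_share_forecast = [0.2, 0.3, 0.6, 0.8, 0.9, 0.7, 0.4, 0.3]  # per hour
flexible_job_hours = 3  # hours of deferrable compute to place

# Rank hours by forecast renewable share and take the greenest ones.
greenest = sorted(range(len(renewable_share_forecast)),
                  key=lambda h: renewable_share_forecast[h],
                  reverse=True)[:flexible_job_hours]
print(f"Run flexible jobs in hours {sorted(greenest)}")
```

Here the deferrable work lands in hours 3-5, when the forecast renewable share peaks; the same principle underlies the job scheduling and load shifting described above.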
AI is also helping to transform the energy efficiency of other carbon-intensive industries. A recent white paper from the World Economic Forum’s AI Governance Alliance, Artificial Intelligence’s Energy Paradox: Balancing Challenges and Opportunities, outlines a range of AI-enabled opportunities to cut electricity use in various industries. Modelling buildings’ energy use and optimizing the performance of heating and air conditioning is one option, as is improving the efficiency of manufacturing through predictive maintenance. In transport, AI-driven vehicle improvements could cut energy consumption by up to 20%, while in agriculture, sensors and satellite imagery are helping to predict crop yields and manage resources.
Balancing AI’s energy use and emissions with its societal benefit takes in many complex, interlinked challenges, and requires a multistakeholder approach. The World Economic Forum’s Artificial Intelligence Governance Alliance is applying a cross-industry and industry-specific lens to understand how AI can be leveraged to transform sectors and drive impact on innovation, sustainability and growth.
As part of this initiative, the Forum’s Centre for Energy and Materials and Centre for the Fourth Industrial Revolution have launched a dedicated workstream to explore the energy consumption of AI systems and how AI can be leveraged as an enabler for the energy transition.
AI undoubtedly has the capacity to create major economic and societal benefits for the world. Striking the ideal balance between the challenges and opportunities of this technology will be crucial to ensuring these benefits outweigh the costs – economic, social and environmental – of using AI.