Artificial intelligence, crypto mining, and the cloud are driving data centre electricity consumption to unprecedented heights.

Data centres’ rising power consumption has been a contentious subject for several years.

Countries with strained power grids or insufficient access to renewables have even frozen the growth of their data centre industries to preserve electricity for the rest of their economies. Ireland, the Netherlands, and Singapore have all grappled with the data centre energy crisis in one way or another.

Data centres are undeniably becoming more efficient, and supplies of renewable energy are increasing. Despite these positive steps, however, the explosion of artificial intelligence (AI) adoption over the last two years has kicked the problem into overdrive.

The AI boom will strain power grids

By 2027, chip giant NVIDIA is projected to ship 1.5 million AI server units annually. Running at full capacity, those servers alone would consume at least 85.4 terawatt-hours (TWh) of electricity per year, more than the annual electricity consumption of most small countries. And NVIDIA is just one chip company; the market as a whole will ship far more AI hardware each year.
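
A back-of-the-envelope sketch of where the 85.4 TWh figure comes from. The roughly 6.5 kW continuous draw per server is an assumption inferred from the numbers above (consistent with a DGX A100-class system), not a quoted specification:

```python
# Back-of-the-envelope check of the 85.4 TWh/year figure.
# Assumption: ~6.5 kW continuous draw per AI server (DGX A100-class),
# inferred from the article's own numbers, not a vendor spec quoted here.
servers_per_year = 1_500_000   # projected annual NVIDIA AI server shipments
power_per_server_kw = 6.5      # assumed continuous draw per server, kW
hours_per_year = 8_760         # 24 h * 365 days

annual_twh = servers_per_year * power_per_server_kw * hours_per_year / 1e9
print(f"{annual_twh:.1f} TWh/year")  # ~85.4 TWh, matching the estimate above
```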

This explosion of AI demand could see data centres’ electricity consumption double as soon as 2026, according to a report by the International Energy Agency (IEA). The report identifies data centres as significant drivers of growth in electricity demand across multiple regions of the world.

In 2022, the combined global data centre footprint consumed approximately 460 TWh. At the current rate, spurred by AI investment, data centres are on track to consume over 1,000 TWh in 2026.

“This demand is roughly equivalent to the electricity consumption of Japan,” adds the report, which also notes that “updated regulations and technological improvements, including on efficiency, will be crucial to moderate the surge in energy consumption.”
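
For perspective, here is a quick sketch of the compound annual growth rate implied by those two IEA data points, assuming smooth year-on-year growth (a simplification for illustration):

```python
# Implied compound annual growth rate between the IEA's two data points.
# Assumes smooth year-on-year growth, which is a simplification.
consumption_2022_twh = 460
consumption_2026_twh = 1_000
years = 2026 - 2022

cagr = (consumption_2026_twh / consumption_2022_twh) ** (1 / years) - 1
print(f"Implied growth: {cagr:.0%} per year")  # ~21% per year
```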

Why does AI increase data centre energy consumption? 

All data centres comprise servers, cooling equipment, and the systems necessary to power them both. Advances like cold aisle containment, free-air cooling, and even using glacial seawater to keep temperatures under control have all reduced the amount of energy demanded by data centres’ cooling systems. 

However, while the share of energy used by cooling systems relative to the overall power draw has remained stable (and has even fallen in some cases), the energy used by computing has only grown.

AI models consume more energy than traditional data centre applications because of the vast amounts of data they are trained on. The complexity of the models themselves and the volume of user requests (ChatGPT received 1.6 billion visits in December 2023 alone) also push consumption higher.

This trend is only expected to accelerate as tech companies work to deploy generative AI models as search engines and digital assistants. A typical Google search consumes roughly 0.3 Wh of electricity, while a query to OpenAI’s ChatGPT consumes about 2.9 Wh. With roughly 9 billion searches performed daily, serving them all this way would require almost 10 TWh of additional electricity per year.
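
The arithmetic behind that estimate is simple enough to sketch. The version below uses the per-query figures quoted above and assumes every search is replaced by an LLM query, so it is an upper bound:

```python
# Electricity cost of serving every Google search with a ChatGPT-style model.
# Sketch assuming full replacement of search by LLM queries (an upper bound).
google_search_wh = 0.3     # per-query electricity, Wh
chatgpt_query_wh = 2.9     # per-query electricity, Wh
searches_per_day = 9e9
days_per_year = 365

queries_per_year = searches_per_day * days_per_year
gross_twh = chatgpt_query_wh * queries_per_year / 1e12                      # ~9.5 TWh
net_twh = (chatgpt_query_wh - google_search_wh) * queries_per_year / 1e12   # ~8.5 TWh

print(f"Gross LLM consumption: {gross_twh:.1f} TWh/year")
print(f"Net increase over search: {net_twh:.1f} TWh/year")
```

Whether counted gross or net of today’s search consumption, the sketch lands just under 10 TWh a year, consistent with the figure above.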
