Energy supplies could be holding back the next generation of artificial intelligence services.

The explosive growth of artificial intelligence (AI) is creating a new challenge for global electricity supplies, intensifying demand on already strained power grids.

The issue has captured the attention of key players in the technology sector, who are voicing concerns about the sustainability of current energy resources.

Elon Musk, the billionaire tech mogul, has noted a shift in the industry's limiting factors, saying that while AI development was “chip constrained” last year, the latest bottleneck is “electricity supply”.

Similarly, Andy Jassy, CEO of Amazon, has raised alarms about the energy requirements for AI, saying: “there is not enough energy right now to run new generative AI services”.

The demand for data centres, which house the infrastructure necessary for AI operations, is soaring to unprecedented levels. 

“Demand for data centres has always been there, but it’s never been like this,” says Pankaj Sharma, executive vice president at Schneider Electric’s data centre division.

The search for suitable locations for these power-hungry facilities is also becoming increasingly difficult.

Daniel Golding, chief technology officer at Appleby Strategy Group, has noted the logistical challenges.

“One of the limitations of deploying [chips] in the new AI economy is going to be...where do we build the data centres and how do we get the power,” he said, adding that “at some point the reality of the [electricity] grid is going to get in the way of AI”. 

The escalating demand for electricity not only stresses the grid but also sparks concerns about the environmental impacts of such growth. 

As nations work towards renewable energy goals and the electrification of transportation to combat climate change, the added pressure from the technology sector could pose significant challenges.

The financial stakes involved in expanding data centre capabilities are monumental. 

Research group Dgtl Infra predicts that global data centre capital expenditure will surpass $350 billion in 2024. 

Nvidia's CEO, Jensen Huang, has further highlighted the scale of investment needed, estimating that “$US1 trillion worth of data centres would need to be built in the next several years to support generative AI”.