Training AI uses as much power as a commercial jet, and it’s getting worse

Where is the line?

by Lewis White

As artificial intelligence becomes more ingrained in our daily lives, the technology’s environmental impact is becoming a concern. Nowadays, there’s an AI program for anything from upscaling images to turning cartoon characters into real people. However, training AI to perform those unique tasks is resource-intensive.

The cost of training AI

In a report, The Register says that improving artificial intelligence is rapidly becoming more resource-intensive. Citing a research paper from the University of Massachusetts, The Register writes that training AI consumes more resources, and more money, every single year.

The paper’s researchers note that the models of OpenAI, the field’s most prominent lab, are monstrously eating up resources as they grow. Every year, the resources required for OpenAI’s training runs increase “by a factor of 10”.

Google’s BERT model is likewise a resource hog. The same researchers discovered that training the program on GPUs generated as much carbon emissions “as a trans-American jet flight”. And as training continues, the AI’s hunger for power only grows.


Constantly increasing

As The Register notes, the needs of AI models are always increasing. Because AIs use machine learning to improve, they constantly require more information to be processed. To become better, they need larger datasets, which in turn require more power to process. And as trained models get bigger, even transferring those models between machines can become power-intensive.

In response to the growing needs of AI, Kate Saenko, a professor of computer science at Boston University, explained:

“The general trend in AI is going in the wrong direction for power consumption. You might train a much bigger network and then distil it into a smaller one so that you can deploy a smaller network and save computation and deployment at inference time.”
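The distillation Saenko describes means training a small "student" network to mimic a large "teacher", so only the cheap student runs at inference time. A minimal sketch of the core idea, the distillation loss, might look like this (the logits and temperature here are illustrative values, not from any real model):

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature: a higher T gives softer probabilities,
    exposing more of the teacher's learned class similarities."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the teacher's softened output distribution
    and the student's. Minimising this pushes the student to imitate
    the teacher without retraining a model of the teacher's size."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Toy logits for a 3-class problem: the loss is zero when the student
# matches the teacher exactly, and positive otherwise.
teacher = [4.0, 1.0, 0.5]
student = [3.0, 1.5, 0.2]
print(distillation_loss(teacher, student))
```

In a real training loop this term is combined with the ordinary hard-label loss and backpropagated through the student only, which is what makes deployment cheaper even though the initial teacher training remains expensive.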



The capitalist nightmare of training AI

As training AI becomes more intensive, researchers believe that smaller teams won’t be able to keep up. While mega-corporations like Google, Tesla, Facebook and Microsoft have deep pockets to fund their neural nets, smaller teams may have as little as one PC.

Weights & Biases CEO Lukas Biewald warns that massive companies will be the only ones with usable AIs in the future. He says:

“You have to buy the energy to train these models, and the only people that can realistically afford that will be Google and Microsoft and the 100 biggest corporations.”

