The number of applications using artificial intelligence is growing rapidly – and with it, power consumption. What solutions are there?
What is it about? Artificial intelligence (AI) is meant to help where humans are too slow, where the work is too tedious, or where a machine is more accurate. What is often forgotten when using AI, however, is that training and operating AI systems requires enormous amounts of energy, because computing power always means electricity. The more complex the calculations, the more power the servers and computers need. And AI calculations are always complex – after all, the user expects a specific, detailed and possibly personalized answer to a specific question.
The smarter an answer needs to be, the greater the computing power required and thus the greater the power consumption.
Why so much electricity? Every single user query – in ChatGPT, for example – consumes a lot of power, since it triggers extensive computing operations across possibly dozens of servers. And before a system is even able to produce the smartest possible answers, it has to be trained. “To train a language model, it has to perform calculations on thousands of billions of words,” says SRF digital editor Guido Berger. And: “The smarter an answer has to be, the greater the computing power required and thus the greater the power consumption.” For photo or video applications, power consumption is higher still.
It is completely unclear whether AI providers will ever earn anything at all.
Who pays for this electricity? “AI providers are currently burning their investment money in their data centers,” notes Guido Berger. Whether providers will one day be able to pass electricity costs on to users – for example by charging for queries – is still completely open. “It is also unclear whether AI providers will ever earn anything at all,” says Berger. All these uncertainties make it difficult to predict the number of future queries – and thus the future power consumption of AI. One thing does seem clear: the more expensive it becomes for users, the fewer requests AI services will receive.
Extremely high power consumption
A Dutch scientist has calculated that merely operating ChatGPT, the best-known AI application, consumes as much electricity as 40,000 households. And that is just one of a rapidly growing, increasingly unmanageable range of AI offerings. As a result, no one really knows how much electricity AI uses in total. But experts are certain that consumption will rise rapidly worldwide. According to Ralf Herbrich of the Hasso Plattner Institute in Potsdam, all the world's computers, including data centers, currently consume around eight percent of all electricity produced. “There are estimates that consumption could rise to 30 percent in the next few years,” Herbrich told the dpa news agency.
Reduce power consumption? Because providers of AI services pay for computing power and electricity themselves, they have a strong interest in reducing both. Accordingly, a great deal of research is going into efficiency. One approach could be smaller AI language models, which require far less training data, computing power and therefore electricity. The condition: the smaller models would still have to be good enough for their specific application. Berger also considers it unrealistic that electricity consumption will rise as massively as some experts predict, because: “Nobody could pay for that anymore.”