The rise of artificial intelligence (AI) is undeniable, but a new report in The New Yorker sheds light on a potential hidden cost: its massive power demands. According to the report, ChatGPT, the popular chatbot developed by OpenAI, uses a staggering 500,000 kilowatt-hours of electricity every day to process the requests of its 200 million users. That is more than 17,000 times the electricity the average American household uses in one day.
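The 17,000-times comparison can be checked with rough arithmetic. A minimal sketch, assuming the report's 500,000 kWh/day figure and a typical U.S. household figure of about 10,600 kWh per year (an assumption for illustration, not a number from the report):

```python
# Sanity check of the article's comparison (inputs are assumptions, not sourced):
# - ChatGPT's reported daily usage: 500,000 kWh
# - Average U.S. household: ~10,600 kWh/year, i.e. ~29 kWh/day
chatgpt_daily_kwh = 500_000
household_daily_kwh = 10_600 / 365  # ~29 kWh/day

ratio = chatgpt_daily_kwh / household_daily_kwh
print(f"ChatGPT uses roughly {ratio:,.0f}x an average household's daily electricity")
```

With those inputs the ratio comes out a little above 17,000, consistent with the article's "more than 17,000 times" phrasing.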
Dangerous numbers
What makes these numbers even more alarming is that widespread adoption of AI technology could drive energy consumption far higher. According to a study by Alex de Vries, a data scientist at the Dutch central bank, published in the journal Joule, if Google were to integrate generative AI into all of its searches, the company could consume a mind-boggling 29 billion kilowatt-hours of electricity per year. That is more than the annual electricity consumption of entire countries such as Kenya, Guatemala, and Croatia.
“AI is very energy-intensive,” de Vries told Business Insider. “A single AI server can already consume as much electricity as a dozen UK homes combined.”
The Verge reports that estimating the AI industry's total energy consumption is difficult due to the changing operational needs of large-scale models and the secrecy surrounding tech giants' energy usage.
But de Vries used data from chipmaker Nvidia, a leader in the AI boom, to make a projection. By 2027, the AI sector as a whole could use a staggering 85 to 134 terawatt-hours of electricity per year, amounting to roughly 0.5 percent of global electricity consumption. To put that in perspective, it approaches the amount of electricity that major companies like Samsung use to run their entire business.
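The 0.5 percent figure can be checked against a ballpark global total. A minimal sketch, assuming global electricity consumption of about 26,000 TWh per year (an assumption for illustration; the study's exact baseline may differ):

```python
# Rough check of de Vries's 2027 projection against global consumption.
# The global total of ~26,000 TWh/year is an assumed ballpark, not sourced.
sector_twh_low, sector_twh_high = 85, 134
global_twh = 26_000

share_low = sector_twh_low / global_twh * 100
share_high = sector_twh_high / global_twh * 100
print(f"AI sector share of global electricity: {share_low:.2f}% to {share_high:.2f}%")
```

The high end of the range lands near 0.5 percent, which is where the article's headline percentage comes from; the low end is closer to 0.3 percent.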
OpenAI has not yet commented on these reports.
There is growing concern about the environmental impact of AI's energy demands. As AI develops, addressing its energy consumption will be critical to ensuring a sustainable future.