- ChatGPT uses more than 500,000 kilowatt hours of electricity every day, The New Yorker reported.
- By comparison, the average U.S. home uses just 29 kilowatt-hours a day.
- It is difficult to estimate how much electricity the burgeoning AI industry will consume.
AI consumes a lot of power.
ChatGPT, OpenAI's buzzy chatbot, is probably consuming more than 500,000 kilowatt-hours of electricity to serve its roughly 200 million requests a day, according to The New Yorker.
The publication reported that the average U.S. household uses about 29 kilowatt-hours a day. Divide the electricity ChatGPT uses in a day by that figure and you find that ChatGPT uses more than 17,000 times as much electricity as the average household.
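That back-of-the-envelope division can be sketched as follows, using only the two figures reported above:

```python
# Back-of-the-envelope comparison using the article's reported figures.
chatgpt_kwh_per_day = 500_000   # ChatGPT's estimated daily electricity use (kWh)
household_kwh_per_day = 29      # average U.S. household's daily use (kWh)

ratio = chatgpt_kwh_per_day / household_kwh_per_day
print(f"ChatGPT uses roughly {ratio:,.0f} times an average household's electricity")
# → ChatGPT uses roughly 17,241 times an average household's electricity
```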
That's a lot. And as generative AI evolves and is more widely adopted, that consumption could grow significantly.
For example, if Google were to integrate generative AI into every search, it would consume around 29 billion kilowatt-hours a year, according to calculations by Alex de Vries, a data scientist at the Dutch central bank, published in the sustainable-energy journal Joule. That's more electricity than countries like Kenya, Guatemala, and Croatia consume in a year, The New Yorker reported.
“AI is very energy-intensive,” de Vries told Business Insider. “Every single one of these AI servers can already consume as much power as more than a dozen UK households combined. So those numbers add up very quickly.”
Still, it's difficult to estimate how much electricity the burgeoning AI industry will consume. There is considerable variation in how large AI models operate, and the Big Tech companies driving the boom haven't been exactly forthcoming about their energy use, according to The Verge.
In his paper, however, de Vries based a rough calculation on figures released by Nvidia, which some have dubbed “the Cisco” of the AI boom. The chipmaker holds about 95% of the graphics-processor market, according to New Street Research figures reported by CNBC.
In the paper, de Vries estimates that by 2027 the entire AI sector will consume between 85 and 134 terawatt-hours (a terawatt-hour is 1 billion kilowatt-hours) annually.
“What you're talking about is that AI power consumption could be 0.5 percent of global power consumption by 2027,” de Vries told The Verge. “I think this is a pretty significant number.”
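A quick sanity check of that 0.5% figure, assuming global electricity consumption of roughly 25,000 terawatt-hours a year (an outside assumption, not a number from the article):

```python
# Sanity-check of the "half a percent by 2027" claim.
# global_twh is an assumed round figure for annual world electricity
# consumption, not a number taken from the article.
ai_low_twh, ai_high_twh = 85, 134   # de Vries' 2027 estimate for the AI sector
global_twh = 25_000                 # assumed global annual consumption (TWh)

low_pct = 100 * ai_low_twh / global_twh
high_pct = 100 * ai_high_twh / global_twh
print(f"AI share of global electricity: {low_pct:.2f}%-{high_pct:.2f}%")
# → AI share of global electricity: 0.34%-0.54%
```

The upper bound lands near half a percent, consistent with de Vries' remark.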
Some of the world's biggest power users pale in comparison. According to BI's calculations based on a report from Consumer Energy Solutions, Samsung uses nearly 23 terawatt-hours, while tech giants like Google use just over 12 terawatt-hours and Microsoft just over 10 terawatt-hours to run data centers, networks, and user devices.
OpenAI did not immediately respond to BI's request for comment.
On February 28, Axel Springer, the parent company of Business Insider, joined 31 other media groups in filing a $2.3 billion lawsuit against Google in Dutch court, alleging losses caused by the company's advertising practices.
Axel Springer, Business Insider's parent company, has a global deal that allows OpenAI to train models based on its media brands' reporting.