The Environmental Cost of One ChatGPT Question Is Totally Crazy
You are going to be shocked by the answer.
How many resources does it take when you ask OpenAI's ChatGPT to compose a basic 100-word email?
The answer may trouble you. According to UC Riverside researcher Shaolei Ren, as reported by The Washington Post, generating that one email consumes roughly as much electricity as running 14 LED light bulbs for an hour, plus a full bottle of water. Even more striking, given the sheer number of users worldwide, that adds up to a substantial environmental cost.
Suppose 10% of American workers used ChatGPT at least once a week to compose emails. By Ren's calculations, over the course of a year ChatGPT would consume 435 million liters of water and 121,517 megawatt-hours of electricity.
That’s equivalent to all the water consumed by every Rhode Island household for a day and a half and all the electricity needed to light up every home in Washington, DC, for twenty days.
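For readers who want to sanity-check those totals, here is a rough back-of-envelope sketch in Python. The inputs are illustrative assumptions, not figures taken from Ren's study: LED bulbs drawing about 10 watts each, roughly half a liter of water per email, and a US workforce of about 161 million people.

```python
# Back-of-envelope sanity check of the yearly totals quoted above.
# All inputs are rough assumptions, not figures from Ren's study.

LED_BULB_WATTS = 10           # assumed power draw of one LED bulb
BULBS = 14                    # "14 LED lights for an hour" per email
WATER_PER_EMAIL_L = 0.52      # assumed "full bottle of water" per email, in liters

US_WORKFORCE = 161_000_000    # assumed size of the US workforce
SHARE_USING_CHATGPT = 0.10    # 10% of workers
EMAILS_PER_WEEK = 1           # one 100-word email per week
WEEKS_PER_YEAR = 52

# Energy per email: 14 bulbs x 10 W x 1 hour = 140 Wh = 0.14 kWh
energy_per_email_kwh = BULBS * LED_BULB_WATTS / 1000

emails_per_year = US_WORKFORCE * SHARE_USING_CHATGPT * EMAILS_PER_WEEK * WEEKS_PER_YEAR

energy_mwh = emails_per_year * energy_per_email_kwh / 1000
water_million_l = emails_per_year * WATER_PER_EMAIL_L / 1e6

print(f"Emails per year: {emails_per_year:,.0f}")
print(f"Electricity:     {energy_mwh:,.0f} MWh  (article cites 121,517 MWh)")
print(f"Water:           {water_million_l:,.0f} million liters  (article cites 435 million)")
```

Under these assumptions the sketch lands in the same ballpark as the reported totals, which suggests the headline numbers follow directly from the per-email estimates multiplied out over a year.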
And that's only today's usage. Those figures could soon look small, given that big tech is so certain of AI's explosive growth that Microsoft wants to restart a nuclear power plant just to power its AI data centers.
ChatGPT uses so much water because AI data centers generate enormous heat while performing computations.
These facilities consume large volumes of water to cool their servers down. In regions with limited water supplies or inexpensive electricity, data centers instead run power-hungry air conditioning to keep the servers cool.
That can put a strain on the grid. In states like Arizona and Iowa, the conflict is already evident between meeting the needs of the general public and feeding the data centers' ravenous appetite for power and water, an appetite local officials tolerate because it drives economic growth and tax revenue.