Why Using ChatGPT Is Not Bad for the Environment

The need for a lot more electricity (and therefore more power plants) is vastly bigger than just AI. Some other huge examples:

  • People converting their homes from gas/oil heating to electric heating.
  • People replacing gas-powered cars with EVs.
  • People using ever-larger numbers of personal electronic devices.
  • The use of cloud/internet services for nearly everything, not just AI. All of that requires servers in data centers.

Note that many of these transitions (especially home heating and vehicles) are being strongly encouraged (and sometimes mandated) by laws and government regulations.

When combined with the fact that states (again, often because of laws and regulations) are shutting down coal, oil, gas and nuclear power plants, it should not be the least bit surprising that there’s a massive energy crisis coming. And the only solution (which very few people in government want to support) is to build new power plants and expand the grid.

AI, while possibly significant, isn’t going to be the cause of or the solution to this problem.


I’m glad someone beat me to posting the MIT Technology Review article. It’s another good look at things, though it doesn’t seem to acknowledge that the industry actively wants to reduce power use as well. Or, rather, the industry wants its code to become more efficient, because then it can spend less on power per prompt. Judging from the API costs of models (how much it costs per prompt when you’re paying for each one), those costs have dropped radically over time.

Following the sentiment expressed in your post, here are two other ChatGPT rewrites that you may find more accessible:

Winston Churchill
It is my firm belief that Masley’s insightful posts on Substack illuminate the formidable challenges that humanity faces when confronted with complex issues, particularly when one clings tenaciously to strong, emotion-driven opinions. Furthermore, we must acknowledge the growing hostility towards expertise that has taken root in our society over the past decade—a hostility that stands as a colossal barrier to the exercise of critical thinking.

In the realm of generative AI, we find ourselves beset by a confluence of anti-technology sentiments, aversions to business, and a troubling lack of numerical literacy. This has given rise to the alarming rhetoric that “ChatGPT is wreaking havoc upon our environment.” Moreover, we observe a troubling tendency among many to focus solely on isolated aspects of intricate systems—be it the production and use of glass and metal water bottles or the complex logistics of Amazon. Such narrow perspectives may yield emotionally compelling arguments, yet they often result in actions that are, at best, marginally effective. Let us strive for a broader understanding, for it is only through comprehensive insight that we may navigate the challenges before us.

Quentin Tarantino
INT. COFFEE SHOP - DAY

The camera zooms in on a table where two friends, JACK and LUCY, sit sipping their coffee. JACK leans in, animated, as he talks about Masley’s Substack posts.

JACK
(leaning forward, intense)
You know what Masley’s been laying down on Substack? It’s like a spotlight on the mess we humans make when we tackle complex issues. I mean, when you’ve got a strong opinion fueled by emotion, it’s like trying to drive a car with no brakes. You just crash.

LUCY nods, intrigued, her eyes wide.

LUCY
Yeah, but what about the hostility? It’s like a plague.

JACK
Exactly! This hostility towards expertise that’s been brewing over the last decade in the U.S.? It’s a massive roadblock to clear thinking. Like, come on! We need experts, not just loud voices.

He takes a sip of his coffee, then leans back, crossing his arms.

JACK
And let’s talk about generative AI. It’s a cocktail of anti-tech vibes, anti-business rants, and a whole lot of people who can’t count. You hear the chatter? “ChatGPT is destroying the environment!” It’s like a bad movie line.

LUCY chuckles, shaking her head.

LUCY
Right? It’s all surface-level drama.

JACK
(gesturing wildly)
Exactly! People only see one piece of the puzzle. They’re fixated on glass and metal water bottles or Amazon’s logistics like it’s the whole story. But it’s not! It’s a tangled web, and they’re just pulling at one thread.

He leans in closer, eyes narrowing.

JACK
And those calls to action? They sound good, but they’re like a flashy car with no engine. Emotionally appealing but barely effective.

LUCY raises her coffee cup, a smirk on her face.

LUCY
To the tangled web, then.

They clink their cups together, the camera pulling back as they continue their animated discussion, the world around them fading into the background.

FADE OUT.

Some actual data from Sam Altman:

(Altman’s figures)

At a billion queries a day, that works out to 340 MWh of electricity and 85,000 gallons of water per day.
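That arithmetic is easy to check. A minimal sketch, assuming Altman’s published per-query averages of roughly 0.34 Wh of electricity and 0.000085 gallons of water:

```python
# Scaling Altman's stated per-query averages to a billion queries/day.
# Assumed inputs: ~0.34 Wh of electricity and ~0.000085 gallons of
# water per average query.

WH_PER_QUERY = 0.34
GAL_PER_QUERY = 0.000085
QUERIES_PER_DAY = 1_000_000_000

energy_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh
water_gal = GAL_PER_QUERY * QUERIES_PER_DAY

print(f"{energy_mwh:.0f} MWh of electricity per day")  # 340
print(f"{water_gal:.0f} gallons of water per day")     # 85000

# As a continuous draw, 340 MWh/day averages out to about 14 MW:
print(f"{energy_mwh / 24:.1f} MW average power")       # 14.2
```

Note that 340 MWh spread over a day corresponds to an average continuous draw of only about 14 MW, which is why the unit matters.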


Some context from a cursory look at search results:

Using water consumption data from the Commercial Buildings Energy Consumption Survey (CBECS), EIA estimates that the 46,000 [1] large commercial buildings (greater than 200,000 square feet) used about 359 billion gallons of water (980 million gallons per day) in 2012. This level represents an estimated 2.3% of the total public water supply in the United States [2]. On average, these buildings used 7.9 million gallons per building, 20 gallons per square foot, and 18,400 gallons per worker in 2012. On a daily basis, they used an average of 22,000 gallons per building, 55.6 gallons per thousand square feet, and 50.1 gallons per worker.
https://www.eia.gov/consumption/commercial/reports/2012/water/
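Taking both estimates at face value, the 85,000 gallons per day quoted earlier in this thread for a billion ChatGPT queries can be put next to the EIA figure:

```python
# Comparing water use: a billion ChatGPT queries per day (~85,000 gal,
# per the Altman figures quoted earlier in the thread) vs. EIA's
# estimate for all large commercial buildings (~980 million gal/day).

ai_gal_per_day = 85_000
large_buildings_gal_per_day = 980_000_000
avg_building_gal_per_day = 22_000  # EIA per-building daily average

share = ai_gal_per_day / large_buildings_gal_per_day
print(f"AI share of large-building water use: {share:.4%}")  # 0.0087%

# Equivalently, about four average large commercial buildings:
print(f"{ai_gal_per_day / avg_building_gal_per_day:.1f} buildings")  # 3.9
```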

On average, each square meter of hotel-room space uses around 0.55 kWh per day, so the amount your hotel consumes will depend on the number and size of its rooms. For example, a hotel with just over 100 rooms of 91 m² each could expect to consume around 5,000 kWh per day just for the hotel rooms.
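A quick check of that per-room arithmetic (the room size and room count are just the example’s assumptions):

```python
# Hotel-room energy estimate: ~0.55 kWh per square meter per day.
# Assumed example: 100 rooms of 91 m2 each, as in the text above.

KWH_PER_M2_PER_DAY = 0.55
room_area_m2 = 91
num_rooms = 100

per_room = KWH_PER_M2_PER_DAY * room_area_m2      # ~50 kWh/room/day
total = per_room * num_rooms

print(f"{per_room:.0f} kWh per room per day")     # 50
print(f"{total:.0f} kWh per day for all rooms")   # 5005
```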

Mistral AI has audited its water and other resource usage:

https://bsky.app/profile/emollick.bsky.social/post/3luljwvstrs2d


Yes, this is true.

There are other factors that cannot be accounted for, such as “competitive” behavior and organizations that do whatever they want while claiming otherwise (or the opposite). This is becoming clear with the X (Grok) supercomputer complex in Memphis, TN, although there were many warnings over the past year as they raced to build their system and “beat” the industry at any cost.

An article from yesterday’s Washington Post (Gift Link[1]) puts into perspective the climate impact of AI vs other aspects of daily computing:

You’d be hard-pressed to ask enough questions to ChatGPT, Perplexity or other AI services to meaningfully change your personal emissions. Asking AI eight simple text questions a day, every day of the year, adds up to less than 0.1 ounces of climate pollution, our data suggests. That’s 0.003 percent of the average American’s annual carbon footprint.

… The exception is AI-generated video: One five-second clip requires 944 Wh, equivalent to riding 38 miles on an e-bike.

Moreover, the climate impact of AI is insignificant relative to other sources of digital emissions:

Overall, our personal and work-related digital emissions are dominated by just three things: TV, digital storage and internet or video use on your computer.​ Why is TV such an energy hog? Americans tend to watch a lot of TV — 4.5 hours daily — and big screens suck up electricity for high-quality display and streaming, even when they’re in standby mode.


And digital emissions are insignificant relative to other personal choices:

This gets to the truism about where we, as individuals, can see the biggest bang for our emissions buck: what we eat, how we move around and how we heat or cool our homes.

Nothing in your digital life, for example, comes close to your commute. For the average car-driving American, going back and forth to work emits at least eight times as much as digital activities for work and personal combined (another argument against return to office).


  1. Articles accessed via Gift Links are free to read without a subscription, but require a Washington Post account, which is also free. ↩︎

AI usage has nothing on pets.

The average dog (they eat lots of meat, especially beef) accounts for about 770 kg of CO2 emissions annually; a large dog, about 1,500 kg. For comparison, the average US car driven the average US distance comes in at 4,600 kg. If you asked an AI/LLM two dozen queries every single day for a year, that would amount to about 34 kg. A cat, BTW, still clocks in at 310 kg (indoor; worse if outdoor).

I don’t know about where you guys live, but here in the Bay Area pets are sacred. Nobody would forgo their dog because of climate change, even though we’re usually good at talking a big game about it. So when AI keeps being brought up in the context of GHG emissions, I do wonder whether this is really about something other than how we actually curb climate change.

(And just in case you’re truly concerned about the GHG emissions of those 24 queries every single day: next year, just drive 85 miles less. That’s merely a quarter mile a day walking instead of driving. Done.)
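For what it’s worth, those figures hang together. A rough check, where the per-query and per-mile emission factors are inferred from the numbers in this post rather than taken from independent data:

```python
# Sanity-checking the pet/AI/driving comparisons above.
# Assumed factors: ~4 g CO2e per LLM query (consistent with the
# 34 kg/year figure for 24 queries/day) and ~0.4 kg CO2e per mile
# driven (4,600 kg/year over ~11,500 miles).

G_PER_QUERY = 4.0        # grams CO2e per query (assumption)
KG_PER_MILE = 0.4        # kg CO2e per mile driven (assumption)

queries_per_year = 24 * 365
ai_kg = queries_per_year * G_PER_QUERY / 1000
print(f"24 queries/day is about {ai_kg:.0f} kg CO2e/year")   # 35

dog_kg = 770
print(f"average dog vs. heavy AI use: {dog_kg / ai_kg:.0f}x")  # 22

miles_to_offset = ai_kg / KG_PER_MILE
print(f"offset by driving {miles_to_offset:.0f} fewer miles/year "
      f"(~{miles_to_offset / 365:.2f} miles/day)")           # 88, 0.24
```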

Time to put the kill switch on the TV outlet!


Fine idea except that TVs are not progressing in that direction.

It’s unlikely folks will want to kill power to their “smart” TVs if it then takes two-plus minutes to turn them back on (booting Android, loading all the junkware, etc.).

I’m having a hard time reconciling the “AI energy use is truly insignificant” with the claims by the big AI companies that they…


I think the incongruity is because building, training, and storing LLMs and generative AIs is highly energy intensive, but individual queries are less so.

Similarly, mining bitcoin is a huge energy suck, but bitcoin transaction queries are not.

(Thanks to @Shamino for refining my comparison)


That’s true of a lot of environmental impacts – driving to the store to get milk is not a massive use of energy. 300 million people driving to the store is.

That’s a bit misleading.

Querying the Bitcoin blockchain to look up/verify transactions is lightweight.

But any transaction that transfers bitcoins from one account to another must be recorded in a newly mined block, so completing it carries the mining process (including all of its overhead) with it.


But then shouldn’t that energy use count? If the argument is “using AI isn’t very energy intensive so you shouldn’t worry about it,” then is it fair to say we’ll just ignore massive infrastructure costs that will have to be built out to support that use?

Is it okay to trade in your Prius for an Escalade with the emissions controls disabled because, after all, how much of a difference will one excessive driver and one particularly dirty vehicle make overall?

Just to clarify: I don’t want to debate whether LLMs/generative AIs are “good” or “bad” based on power requirements. My intention was to address how companies including Meta and Microsoft can say they need new energy infrastructure while online commentators can also assert that using products such as ChatGPT and Perplexity is not as threatening to the environment as some people say.