The question this post is trying to answer is “Should I boycott ChatGPT or limit how much I use it for the sake of the climate?” and the answer is a resounding and conclusive “No.”
It’s not bad for the environment if you or any number of people use ChatGPT, Gemini, Claude, Grok, or other large language model (LLM) chatbots. You can use ChatGPT as much as you like without worrying that you’re doing any harm to the planet. Worrying about your personal use of ChatGPT is wasted time that you could spend on the serious problems of climate change instead.
I’ve received some questions and comments about the environmental impact of AI chatbots and imagebots, so I was pleased to find Masley’s detailed analysis. If you’re interested in the topic, the linked article is just the cheat sheet for a longer 9,000-word article on the topic, and he has also responded to criticisms of his posts.
His conclusion is that the amount of energy and water consumed by a ChatGPT prompt, while real when multiplied across 1 billion prompts per day, isn’t worth worrying about compared to other activities. One ChatGPT prompt is roughly equivalent to running a vacuum cleaner for 10 seconds or a laptop for 3 minutes—or possibly even ten times less. Masley calculates that, on a daily basis, the average American uses enough energy for 10,000 ChatGPT prompts and consumes enough water for 24,000–61,000 prompts. When it comes to addressing climate change, we would be better served to focus on systemic change, or at least other personal lifestyle decisions.
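To make the scale concrete, here is a back-of-the-envelope sketch. The "roughly 3 Wh per prompt" and "30 kWh per day" figures are assumptions I've chosen for illustration, picked to be consistent with the ratios Masley reports; his article works through the actual ranges.

```python
# Back-of-the-envelope comparison: one ChatGPT prompt vs. a person's daily energy use.
# Both figures below are illustrative assumptions, not measured values.

WH_PER_PROMPT = 3.0          # assumed energy per ChatGPT prompt, in watt-hours
DAILY_PERSONAL_KWH = 30.0    # assumed average American's daily energy use, in kWh

prompts_per_day_equivalent = (DAILY_PERSONAL_KWH * 1000) / WH_PER_PROMPT
share_of_one_prompt = WH_PER_PROMPT / (DAILY_PERSONAL_KWH * 1000)

print(f"Daily energy use is equivalent to {prompts_per_day_equivalent:,.0f} prompts")  # ~10,000
print(f"One prompt is {share_of_one_prompt:.2%} of a day's energy use")                # ~0.01%
```

Even if the per-prompt figure were ten times higher than assumed here, a single prompt would still be about 0.1% of one day's energy use.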
While there is no requirement for anyone to use a chatbot or AI answer engine, if you are exploring their capabilities, you can do so without worrying that they will add meaningfully to your existing climate footprint.
I think Masley’s Substack posts point out a lot of the challenges humans face in thinking about complex issues, especially when one holds a strong, emotionally based opinion. I also think the hostility towards expertise that has emerged over the last decade or so in the US is a gigantic impediment to critical thinking.
For generative AI, I think a confluence of anti-technology views, anti-business views, and innumeracy leads to a lot of the “ChatGPT is destroying the environment” rhetoric. Further, the tendency of many to look at only single aspects of intricate systems, such as the manufacturing and use of glass and metal water bottles or Amazon’s logistics operations, can also lead to emotionally attractive but marginally effective calls to action.
Generating electricity DOES have an environmental impact, no matter how it’s done. So, conversing with an AI pollutes. Just like my reply.
The 13W LED bulbs in my office use power, and less will be used if I turn them off when I step out for a few minutes. But my home’s HVAC system consumes orders of magnitude more power. It’s highly unlikely that the amount saved by avoiding lightbulbs altogether would even be noticeable.
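A rough sketch of that comparison, using assumed wattages and runtimes rather than measurements of my actual equipment:

```python
# Energy saved by switching off office lights during short breaks vs. daily HVAC use.
# All figures are illustrative assumptions.

LED_BULB_W = 13             # one LED bulb, as stated above
NUM_BULBS = 4               # assumed number of bulbs in the office
BREAK_MINUTES_PER_DAY = 30  # assumed total time the lights could be switched off

HVAC_KW = 3.5               # assumed HVAC draw while running, in kilowatts
HVAC_HOURS_PER_DAY = 8      # assumed daily runtime

lights_saved_kwh = LED_BULB_W * NUM_BULBS * (BREAK_MINUTES_PER_DAY / 60) / 1000
hvac_kwh = HVAC_KW * HVAC_HOURS_PER_DAY

print(f"Lights-off savings: {lights_saved_kwh:.3f} kWh/day")   # ~0.026 kWh
print(f"HVAC use:           {hvac_kwh:.1f} kWh/day")           # ~28 kWh
print(f"Ratio: ~{hvac_kwh / lights_saved_kwh:.0f}x")           # ~1000x
```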
Funny enough: I fed ChatGPT the article above and asked it about its energy consumption compared to using Google or Ecosia. And guess what ChatGPT told me (the conclusion comes after the comparison!):
TL;DR: ChatGPT uses significantly more energy and water than “normal” search engines.
## Comparison: ChatGPT vs. Google/Ecosia – Resource Use per Query

| Aspect | ChatGPT | Google (traditional) | Ecosia |
| --- | --- | --- | --- |
| Energy Consumption | 0.001–0.01 kWh (depending on model size) | ~0.0003 kWh (~30x less than GPT) | Similar to Google, possibly slightly higher (uses green energy) |
| CO₂ Emissions | 2.5–5 g CO₂ | ~0.2 g CO₂ (~20–25x less) | Similar, but offset by tree planting |
| Water Use | Up to 500 ml per prompt | ~1–10 ml (~50–500x less) | Comparable to Google |
| Type of Response | Text synthesis, contextual understanding | Links to web content, no deep processing | Like Google, with environmental focus |
## What Does This Mean?

A single ChatGPT prompt uses:

- About 30 times more energy than a Google search
- 20–25 times more CO₂ emissions
- 50–500 times more water
Google and Ecosia are much more efficient, since they mostly just retrieve and rank existing pages, while ChatGPT activates a large-scale neural network requiring substantial computation.
## What About Ecosia?

Ecosia:

- Uses 100% renewable energy
- Is transparent about sustainability
- Plants trees using most of its profits (over 190 million trees so far)
There’s still some energy and water use per search, but it’s actively offset, making it far more climate-friendly than AI models like ChatGPT.
## Conclusion
If you’re looking for quick facts or an overview of a topic, Google or Ecosia are much more eco-friendly.
ChatGPT is best used for complex, creative, or contextual tasks—but that comes with a higher environmental cost.
Perhaps rerun your prompt to ask for comparisons with everyday activities to put the energy and water into context. 30 times nearly nothing is still nearly nothing.
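To put “nearly nothing” in context: taking ChatGPT’s table at face value (those per-query figures are its estimates, which I haven’t verified), a rough sketch against an assumed household’s daily electricity use looks like this:

```python
# Put the per-query figures from the table above into daily context.
# The per-query numbers are ChatGPT's own estimates; the daily figure is an assumption.

GOOGLE_KWH_PER_QUERY = 0.0003    # from the table above
CHATGPT_KWH_PER_QUERY = 0.009    # ~30x a Google search, near the table's upper range
DAILY_HOUSEHOLD_KWH = 30.0       # assumed daily household electricity use

for name, kwh in [("Google", GOOGLE_KWH_PER_QUERY), ("ChatGPT", CHATGPT_KWH_PER_QUERY)]:
    share = kwh / DAILY_HOUSEHOLD_KWH
    print(f"{name}: {kwh} kWh per query = {share:.4%} of a day's household electricity")
# Google:  0.0003 kWh per query = 0.0010% of a day's household electricity
# ChatGPT: 0.009 kWh per query  = 0.0300% of a day's household electricity
```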
Personally, since researching organizations’ offset programs is similar to assessing how non-profits use donations and takes a similar level of effort, I rarely use offsets as a single determining factor in use/don’t use decisions. For example, if I need to fly somewhere and Airline A and Airline B both offer offsets, I’m simply going to choose the flight that fits my schedule the best.
I was trying to figure out energy usage recently. I turned to ChatGPT.
My comparison:

- energy used to train an AI model
- energy used to manufacture iPhones
I haven’t checked these numbers, but:
To put this into perspective, training a large AI model like GPT-3 consumes around 1,287,000 kilowatt-hours (kWh) of energy. Manufacturing a single iPhone uses approximately 70 kWh. Therefore, the energy required to train GPT-3 is equivalent to producing about 18,385 iPhones.
Given that Apple can produce up to 500,000 iPhones daily at a single facility, the energy used to train GPT-3 equates to less than 4% of a single day’s iPhone production at that plant. This comparison highlights the scale of energy consumption in different tech sectors.
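For what it’s worth, the arithmetic in that answer is internally consistent; here is a quick check using only the figures it quoted (which, again, I haven’t verified):

```python
# Check the arithmetic in the quoted ChatGPT answer, using its own figures.
GPT3_TRAINING_KWH = 1_287_000   # quoted energy to train GPT-3
IPHONE_KWH = 70                 # quoted energy to manufacture one iPhone
IPHONES_PER_DAY = 500_000       # quoted daily output of a single facility

iphones_equivalent = GPT3_TRAINING_KWH / IPHONE_KWH
daily_production_kwh = IPHONES_PER_DAY * IPHONE_KWH
share_of_one_day = GPT3_TRAINING_KWH / daily_production_kwh

print(f"Training GPT-3 = {iphones_equivalent:,.0f} iPhones' worth of energy")  # ~18,386
print(f"...which is {share_of_one_day:.1%} of one day's production energy")    # ~3.7%
```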
Okay. But asking ChatGPT about itself is inherently hazardous, because it’s simply answering based on its training data. If it has trained on a plethora of articles, posts, and back-of-the-envelope calculations, it will answer with whichever string of phrases is most probable and plausible-sounding.
Not that it’s wrong, but mentioning “Ecosia” in your prompt certainly seemed to produce a response that could have been written by Ecosia’s PR team.
Which makes sense. Do a web search for “Ecosia”. The overwhelming majority of hits are from Ecosia’s own web sites. So I would expect an LLM to generate text mostly based on the content of those sites.
ChatGPT, on the other hand, has been the subject of countless news and review articles, so there is a lot more than marketing material for an LLM to draw from.
Not entirely true now that ChatGPT searches the Web too. I find that most of my usage triggers Web searches automatically, though occasionally I notice that it didn’t base its answer on a search; when I tell it to try again with one, the results change noticeably with real-world information.
I have my A/C set at 78F/25C in the summer and the heating at 68F/20C in the winter. I live in central AZ at 5,300 ft/1,615 meters elevation so I don’t get the triple digit Fahrenheit temperatures they get in the Valley, or the large amounts of snow they get up in Flagstaff.
AI, like everything else we rely on, uses energy—but it’s important to put that in perspective.
What often gets missed is that AI is already driving significant energy savings across sectors like transportation, logistics, manufacturing, and even energy production itself. From optimizing delivery routes and reducing fuel consumption to improving factory efficiency and helping balance power grids, AI is cutting waste in ways that scale.
There’s every reason to believe those savings will continue to accelerate—especially as AI systems become more efficient and begin optimizing their own operations.
The Economist recently published a couple of thoughtful articles on this topic, if you’re interested in digging deeper.
The language in your post is a bit too complex for the average reader to easily follow. So, I asked ChatGPT to rewrite it using simpler and more concise language—a common practice in the business world to improve clarity and accessibility.
“I think Masley’s Substack posts do a good job showing how hard it is for people to think clearly about complex topics—especially when they already have strong feelings about something. I also think that in the U.S., a growing distrust of experts over the past decade has made critical thinking even harder.”
“When it comes to generative AI, I think a mix of anti-technology attitudes, distrust of big business, and poor understanding of numbers has fueled a lot of the ‘ChatGPT is ruining the planet’ talk. Also, people often focus on just one part of a bigger system—like how glass or metal water bottles are made, or how Amazon delivers packages—which can lead to emotionally appealing but not very useful solutions.”
I don’t know where chatbots end and AI in general begins. But there have been many reports of needing more energy generation to support growing AI use.
Adding power plants to meet the energy requirements of AI implies that AI is a significant energy consumer.