Google Reveals the Shocking Environmental Cost of a Single AI Prompt — and It’s (Probably) Not What You Might Think
Google reveals the water, energy and carbon footprint of every Gemini AI prompt

For the first time, Google has publicly shared exactly how much energy, water and carbon go into a single AI prompt for Gemini. At first glance, the numbers sound shockingly…small?
According to Google’s newly released data, the median text prompt to Gemini consumes a mere 0.24 watt-hours of electricity, about the same amount you’d use watching TV for less than nine seconds. Each prompt also emits roughly 0.03 grams of CO₂ and uses about 0.26 milliliters of water, around five drops.
But that’s not the whole story. In the same report, Google also showed dramatic efficiency gains: a 33x drop in energy consumption and a 44x reduction in carbon footprint per prompt over the past year, all while improving Gemini’s response quality. Yet despite those impressive numbers, the bigger picture is that AI is straining the environment in ways that can’t be ignored.
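To put those multipliers in perspective, here’s a rough back-of-envelope sketch in Python. Only the published figures (0.24 Wh, 0.03 g CO₂ and the 33x/44x improvements) come from Google; the 100 W TV wattage is our own assumption for the comparison, and the “a year ago” values are implied by the multipliers rather than separately reported.

```python
# Back-of-envelope check of Google's reported per-prompt figures.
# The "a year ago" values below are implied by the 33x / 44x
# multipliers, not figures Google reported directly.

ENERGY_WH_TODAY = 0.24   # watt-hours per median Gemini text prompt (reported)
CO2_G_TODAY = 0.03       # grams of CO2 per prompt (reported)

ENERGY_IMPROVEMENT = 33  # reported year-over-year drop in energy per prompt
CO2_IMPROVEMENT = 44     # reported year-over-year drop in carbon per prompt

# Assumption: a mid-size TV drawing roughly 100 W.
TV_WATTS = 100
seconds_of_tv = ENERGY_WH_TODAY / TV_WATTS * 3600

implied_energy_last_year = ENERGY_WH_TODAY * ENERGY_IMPROVEMENT
implied_co2_last_year = CO2_G_TODAY * CO2_IMPROVEMENT

print(f"0.24 Wh is about {seconds_of_tv:.1f} seconds of a {TV_WATTS} W TV")
print(f"Implied energy per prompt a year ago: ~{implied_energy_last_year:.1f} Wh")
print(f"Implied CO2 per prompt a year ago:    ~{implied_co2_last_year:.1f} g")
```

Run as written, this works out to roughly 7.9 Wh and 1.3 g of CO₂ per prompt a year ago, which is what makes the current 0.24 Wh figure look so small.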
What this means for everyday users
A few prompts aren’t the problem. Google does not publicly state how many prompts it handles in a single day, but based on usage limits and the number of models available free to the public, a reasonable guess is hundreds of millions daily. Although a single Gemini prompt is energy-efficient, multiplied across that volume the footprint adds up quickly, as the rough sketch below shows.
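Here is a minimal illustration of that multiplication. The daily prompt volume is purely a placeholder assumption, since Google doesn’t publish it; only the per-prompt figures come from Google’s report.

```python
# Minimal scale sketch: per-prompt footprint x assumed daily volume.
# PROMPTS_PER_DAY is a placeholder guess, NOT a figure Google has published.

PROMPTS_PER_DAY = 500_000_000        # assumption: "hundreds of millions" of prompts daily

ENERGY_WH_PER_PROMPT = 0.24          # Google's reported median, watt-hours
CO2_G_PER_PROMPT = 0.03              # grams of CO2 per prompt
WATER_ML_PER_PROMPT = 0.26           # milliliters of water per prompt

daily_energy_mwh = PROMPTS_PER_DAY * ENERGY_WH_PER_PROMPT / 1_000_000   # Wh -> MWh
daily_co2_tonnes = PROMPTS_PER_DAY * CO2_G_PER_PROMPT / 1_000_000       # g -> tonnes
daily_water_m3 = PROMPTS_PER_DAY * WATER_ML_PER_PROMPT / 1_000_000      # mL -> cubic meters

print(f"Energy: ~{daily_energy_mwh:,.0f} MWh per day "
      f"(~{daily_energy_mwh * 365 / 1000:,.0f} GWh per year)")
print(f"CO2:    ~{daily_co2_tonnes:,.0f} tonnes per day")
print(f"Water:  ~{daily_water_m3:,.0f} cubic meters per day")
```

Even with that conservative placeholder, the totals land around 120 MWh of electricity and 130,000 liters of water per day, and every figure scales linearly with however many prompts are actually being run.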
Efficiency doesn’t eliminate growth. As AI becomes more deeply embedded in our lives, overall consumption, and its impact on the environment, will continue to grow. This is frequently referred to as the Jevons Paradox, an economic concept first described by British economist William Stanley Jevons in 1865. He observed that when coal-burning steam engines became more efficient, coal consumption didn’t go down; it actually went up.
The same goes for AI. As efficiency improves, we may simply find ourselves using it more, so instead of leading to conservation, efficiency can ultimately mean more overall consumption.
Google says each Gemini prompt now uses significantly less energy and water than it did a year ago, which is a huge efficiency gain. But the Jevons Paradox warns us:
- Lower energy per prompt makes AI feel “cheap” to use.
- Cheap use leads people and businesses to rely on AI more, for everything.
- More reliance means total energy use and environmental impact could still grow, perhaps even faster than before.
Infrastructure demands are soaring. Across the U.S., AI-related energy consumption could triple by 2028, potentially boosting electricity prices and putting pressure on the grid, with costs that may eventually trickle down to consumers.
Why it matters
- Your AI use is small, but not negligible
A few prompts here and there won’t break the grid. But as AI becomes embedded in office tools, smart homes and search, expect your usage to grow. More queries mean more energy use.
- Efficiency helps, but only if demand stabilizes
Google’s improvements are impressive. But the real test will be whether overall usage plateaus or continues growing exponentially.
- Demand smarter AI habits
Look for services that promote efficiency or transparency. Try to batch your prompts to eliminate excess queries, and support providers that publish sustainability metrics.
- Stay curious about the infrastructure
Watch how data centers are being powered, whether with renewables, nuclear or continued fossil fuels. Infrastructure choices today will determine tomorrow’s environmental cost.
Bottom line
Google’s transparency is both an eye-opener and a step toward managing AI usage. But while the per-prompt footprint seems quite small, the massive scale of global AI usage could still carry serious environmental consequences.
AI is increasingly woven through everything, making its hidden carbon and energy cost grow right alongside our usage. It’s not one prompt we need to worry about; it’s all the prompts to come.

Amanda Caswell is an award-winning journalist, bestselling YA author, and one of today’s leading voices in AI and technology. A celebrated contributor to various news outlets, her sharp insights and relatable storytelling have earned her a loyal readership. Amanda’s work has been recognized with prestigious honors, including outstanding contribution to media.
Known for her ability to bring clarity to even the most complex topics, Amanda seamlessly blends innovation and creativity, inspiring readers to embrace the power of AI and emerging technologies. As a certified prompt engineer, she continues to push the boundaries of how humans and AI can work together.
Beyond her journalism career, Amanda is a bestselling author of science fiction books for young readers, where she channels her passion for storytelling into inspiring the next generation. A long-distance runner and mom of three, Amanda’s writing reflects her authenticity, natural curiosity, and heartfelt connection to everyday life — making her not just a journalist, but a trusted guide in the ever-evolving world of technology.