AI’s environmental impact: what you can do
November 25, 2025 | by Gemma
Artificial intelligence is rapidly transforming how we work, from search and creative tools to customer service and data analysis. But as AI adoption accelerates, so too does scrutiny of its environmental impact.
Headlines comparing AI training to long-haul flights or warning of “water-hungry chatbots” have captured attention. Yet, beyond the sensationalism lies a more nuanced truth: AI can be both an energy challenge and an efficiency opportunity.
While much of the debate focuses on model size or data-centre power, everyday choices by users and organisations also play a meaningful role in reducing AI’s footprint. Understanding where and how AI consumes resources is the first step towards making its future more sustainable.
A growing digital appetite
Modern AI systems require enormous computational power. Training the large models behind tools such as ChatGPT, or image generators like Stable Diffusion, involves weeks of computation across thousands of GPUs, consuming vast amounts of electricity.
Even after training, AI’s “serving phase” — when millions of users interact with models daily — continues to draw significant power. Data centres already use over 1% of the world’s electricity, and AI is projected to be one of the fastest-growing contributors.
However, not all AI use is equal. A single AI query consumes only a few watt-hours, which is more than a Google search, but still a fraction of most people’s daily energy use. The issue isn’t individual consumption; it’s scale. Millions of small interactions accumulate into a notable carbon footprint.
This growing demand makes it even more important to consider how, when and why we use AI, as small choices multiplied across millions of users can influence overall energy consumption.
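To see how small per-query costs add up at scale, here is a back-of-envelope calculation. The figures are illustrative assumptions only (a few watt-hours per query, a hypothetical global query volume), not measured values:

```python
# Back-of-envelope estimate of aggregate AI query energy.
# All figures below are illustrative assumptions, not measurements.

WH_PER_QUERY = 3                # assumed energy per AI query, in watt-hours
QUERIES_PER_DAY = 100_000_000   # assumed global daily query volume

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000   # Wh -> kWh
yearly_mwh = daily_kwh * 365 / 1000                 # kWh -> MWh

print(f"Daily energy: {daily_kwh:,.0f} kWh")
print(f"Yearly energy: {yearly_mwh:,.0f} MWh")
```

Even with conservative assumptions, the yearly total lands in the order of a small town’s electricity consumption, which is why collective behaviour matters more than any single query.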
The carbon footprint of AI
The sustainability of any AI system depends heavily on where its electricity comes from.
Major cloud providers are increasingly sourcing renewable energy and improving efficiency, though rapid AI expansion has made it challenging to stay on track with net-zero pledges. A cloud query processed in a wind-powered data centre can have a lower footprint than the same task running locally on a fossil-fuel-powered grid.
As carbon-aware computing evolves, data centres are beginning to schedule non-urgent tasks for periods when renewable energy is abundant. It’s a reminder that smart timing, not just smart technology, can make a difference.
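The scheduling idea above can be sketched in a few lines: given an hourly carbon-intensity forecast, run the non-urgent job in the cleanest hour. The forecast values below are invented for illustration; a real system would pull them from a grid-data service such as a national carbon-intensity feed:

```python
# Minimal sketch of carbon-aware scheduling: defer a non-urgent batch job
# to the forecast hour with the lowest grid carbon intensity.
# Forecast numbers are invented; real deployments would query a grid API.

def best_hour(forecast: dict) -> int:
    """Return the hour (0-23) with the lowest forecast gCO2 per kWh."""
    return min(forecast, key=forecast.get)

# Hypothetical forecast: intensity dips overnight and when solar peaks.
forecast = {0: 180, 3: 140, 6: 200, 9: 160, 12: 90, 15: 110, 18: 250, 21: 210}

print(f"Schedule the batch job for hour {best_hour(forecast)}")  # hour 12
```

The same logic extends naturally to picking the cleanest *region* rather than the cleanest hour, which is how some cloud providers already route flexible workloads.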
The water footprint of AI
Another lesser-known environmental cost of AI is water. Data centres use water primarily for cooling, particularly in warmer regions. Early reports suggested AI tools were consuming worrying amounts of water — even claiming that each ChatGPT conversation “drank a bottle of water.”
In reality, updated research paints a calmer picture. A typical AI query uses just a few millilitres of water, roughly a teaspoon, once cooling and electricity generation are both counted. Per query that is tiny, but multiplied across global usage the total is still significant, especially in regions where water is scarce and as AI becomes part of routine workflows and creative tasks.
How to lower your AI footprint
While each AI query carries a small environmental cost, the way we use these tools collectively has a far greater impact. By making intentional, informed choices, organisations and individuals can reduce unnecessary computation, cut energy use and improve the quality of the outputs they receive.
- Use AI when it genuinely adds value: avoid novelty trends or tasks that don’t need large, resource-intensive models behind them.
- Write clear, targeted prompts: stronger prompts mean fewer retries, fewer tokens and less energy use.
- Choose focused tools when possible: specialist apps for tasks like transcription, grammar checks or summarising are far more efficient than general-purpose models.
- Select smaller models for everyday work: many platforms offer size options — lighter models are often enough and significantly less resource-intensive.
- Reduce re-generation and unnecessary variations: plan what you need before prompting to avoid repeated outputs or endless iterations.
- Be mindful with image and video creation: media models consume considerably more energy, so generate only what you will use and opt for lower resolutions where possible.
- Choose providers with strong renewable energy commitments: look for platforms with clear sustainability reporting, renewable energy sourcing, transparent data-centre practices or verifiable environmental commitments.
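Several of the tips above, particularly right-sizing models and reducing retries, can be automated rather than left to habit. Below is a minimal sketch of a "right-sizing" router that picks the lightest model tier that fits a task. The model names and routing keywords are hypothetical placeholders, not any real provider's catalogue:

```python
# Sketch of a right-sizing router: choose the lightest model tier that
# fits the task. Model names and keyword rules are hypothetical.

MODEL_TIERS = {
    "light": "small-model-v1",    # grammar fixes, summaries, transcription
    "medium": "medium-model-v1",  # drafting, structured analysis
    "heavy": "large-model-v1",    # complex multi-step reasoning
}

def pick_model(task: str) -> str:
    """Route a task description to the smallest adequate model tier."""
    text = task.lower()
    simple = ("grammar", "summarise", "transcribe", "translate")
    if any(word in text for word in simple):
        return MODEL_TIERS["light"]
    if "reason" in text or "plan" in text:
        return MODEL_TIERS["heavy"]
    return MODEL_TIERS["medium"]

print(pick_model("Summarise this meeting transcript"))
```

In practice the routing rules would be richer than keyword matching, but the principle is the point: default to the small model and escalate only when the task demands it.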
Choosing responsible providers is becoming easier as transparency improves. Resources such as RE100 (showcasing companies committed to 100% renewable electricity) and the Science Based Targets initiative offer useful indicators of credible climate action. The major cloud and AI platforms, including AWS, Microsoft and Google, also publish annual sustainability reports outlining their renewable energy mix and data-centre efficiency. Exploring these sources can help you partner with providers whose environmental commitments align with your own.
Working towards a greener digital future
As AI becomes embedded in daily business operations, organisations have the opportunity to lead by example — applying the same environmental responsibility to digital systems as they do to physical ones. Small shifts in digital behaviour, scaled across teams and organisations, can make a tangible difference.
At Mosaic, we believe technology and sustainability can go hand in hand. Through our Conscience Marketing™ approach, we help organisations harness innovation responsibly, ensuring progress benefits both people and the planet.
Let’s use AI not just to work smarter, but to build a more sustainable digital future. If you’d like to explore how your organisation can adopt AI sustainably, get in touch with the Mosaic team to start the conversation.