Every time you ask an AI system to generate an image or solve a complex problem, servers somewhere are burning electricity and drawing on scarce freshwater for cooling. 

As debates continue over how AI might help fight climate change, we risk overlooking the irony: AI itself is becoming one of the fastest-growing drains on energy, water, and land.

Training vs. Inference

Some people assume AI's biggest environmental hit comes from training—those headline-grabbing moments when companies build massive models at enormous energy costs. But training is a one-time event, like constructing a data center. The real environmental drain is inference: the billions of daily queries sent to models as people generate text, images, code, and videos. 

Think of it this way: training builds the model, but inference runs it 24/7. Recent estimates suggest inference accounts for up to 90% of a model's lifetime energy consumption.
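
To see why, it helps to run the numbers. The sketch below uses purely illustrative figures (the training cost, per-query energy, and query volume are assumptions, not measurements) to show how quickly cumulative inference can overtake a one-time training bill.

```python
# Back-of-envelope comparison: one-time training energy vs. ongoing
# inference energy. All figures are illustrative assumptions.

TRAINING_ENERGY_KWH = 1_300_000   # assumed one-time training cost
ENERGY_PER_QUERY_KWH = 0.0003     # assumed average energy per query
QUERIES_PER_DAY = 100_000_000     # assumed daily query volume

daily_inference_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY
breakeven_days = TRAINING_ENERGY_KWH / daily_inference_kwh

print(f"Daily inference energy: {daily_inference_kwh:,.0f} kWh")
print(f"Inference matches the training cost after ~{breakeven_days:.0f} days")
```

Under these assumptions, inference outspends the entire training run in about six weeks, and everything after that is ongoing inference cost.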

Not All Queries Are Created Equal

The energy gap between different AI tasks is staggering. By some estimates, text classification might consume 0.002 kWh per thousand queries, while image generation can demand 2.9 kWh for the same number: a 1,450-fold difference. So when you ask AI to create a picture instead of answering a text question, you're potentially using thousands of times more energy. Text-to-image generation (e.g., Stable Diffusion, Midjourney) is by far the most carbon- and energy-intensive task.
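
Plugging the cited per-1,000-query figures into a couple of lines of arithmetic makes the gap concrete:

```python
# Per-query energy for two task types, using the per-1,000-query
# estimates cited above.

TEXT_CLASSIFICATION_KWH_PER_1K = 0.002
IMAGE_GENERATION_KWH_PER_1K = 2.9

per_query_text = TEXT_CLASSIFICATION_KWH_PER_1K / 1000
per_query_image = IMAGE_GENERATION_KWH_PER_1K / 1000

print(f"Text classification: {per_query_text:.6f} kWh/query")
print(f"Image generation:    {per_query_image:.6f} kWh/query")
print(f"One image costs as much as {per_query_image / per_query_text:,.0f} classifications")
```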

“Reasoning models” like OpenAI's o1 have even created a new category of energy consumption. While traditional models generate responses in one pass, reasoning models first "think through" problems step by step before answering. This internal deliberation, often invisible to users, can triple or quadruple the computational cost of a single query. Meanwhile, OpenAI's black-box routing system automatically decides which model will handle a request, leaving users unable to choose more energy-efficient options for their needs.

Practical Strategies for Lowering Your AI Footprint

Choose the right tool for the job
Current research suggests that DeepSeek and o3 are among the most energy-intensive models, while GPT-4 and Claude Sonnet (ranked highest in eco-efficiency) probably consume fewer resources. Using a massive general-purpose model (e.g., Gemini, GPT-5) for a simple task is like hiring a team of engineers to change a light bulb.

Whenever possible:

  • Select smaller, fine-tuned models for routine tasks (see the sketch after this list).
  • Favor closed-domain prompts over open-ended inquiries. Closed-domain queries (like fact lookups or classification) tend to consume less energy than open-domain prompts that trigger advanced (or multimodal) reasoning.
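
A minimal sketch of that habit in code. The model names and task categories below are hypothetical placeholders, not a real routing API; the point is the pattern of defaulting to the smallest model that can handle the job.

```python
# Hypothetical model router: prefer small, fine-tuned models for
# routine, closed-domain tasks; reserve large general-purpose models
# for open-ended work. All model names are placeholders.

ROUTINE_TASKS = {"classify", "extract", "fact_lookup", "short_summary"}

def pick_model(task_type: str) -> str:
    """Return the smallest model assumed adequate for the task."""
    if task_type in ROUTINE_TASKS:
        return "small-finetuned-model"   # placeholder name
    return "large-general-model"         # placeholder name

print(pick_model("classify"))    # -> small-finetuned-model
print(pick_model("brainstorm"))  # -> large-general-model
```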

Be intentional in prompt design
Even word choice matters. Prompts that ask the LLM to “analyze” trigger more energy-intensive reasoning than those that ask it to “classify”.

To reduce unnecessary computation:

  • Use precise, direct language.
  • Break complex tasks into steps.
  • Set explicit output limits (constraints) to avoid needlessly long responses.
  • Combine related questions at the outset. Batching queries avoids repeated context-building, which consumes additional compute (see the sketch after this list).
  • Input previous results (e.g., via file upload) instead of repeating the same query anew.
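
Here is what batching and explicit output limits look like against a chat-style API. This sketch assumes the OpenAI Python client; the model name and token budget are placeholder choices you would tune to the task.

```python
# Batch related questions into one request and cap the output length,
# rather than sending three separate queries that each rebuild context.
# Model name and token budget are assumptions, not recommendations.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

batched_prompt = (
    "Answer each question in one sentence:\n"
    "1. What is a kWh?\n"
    "2. What does PUE stand for?\n"
    "3. How do data centers use water for cooling?"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",   # assumed small model
    messages=[{"role": "user", "content": batched_prompt}],
    max_tokens=150,        # explicit output limit to avoid long responses
)
print(response.choices[0].message.content)
```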

AI evangelists insist that efficiency gains are coming. But efficiency alone won’t solve the problem. As AI becomes more efficient, usage will probably just expand, a rebound effect sometimes called Jevons paradox.

A sustainable approach requires intentional limits:

  • Track your AI use (a minimal logging sketch follows this list).
  • Set boundaries on high-energy tasks.
  • Consider whether AI is truly adding value or merely substituting for work you could do without it.
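
Tracking doesn't require special tooling; appending one row per query to a CSV is enough to make your own habits visible. The per-task energy figures below are rough assumptions for illustration, not measurements.

```python
# Minimal AI-usage logger: one CSV row per query, with an assumed
# energy figure per task type so high-energy habits stand out.
import csv
from datetime import datetime, timezone
from pathlib import Path

ASSUMED_KWH_PER_QUERY = {"text": 0.000002, "reasoning": 0.00001, "image": 0.0029}
LOG_FILE = Path("ai_usage_log.csv")

def log_query(task_type: str, note: str = "") -> None:
    """Append a timestamped record of one AI query."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "task_type", "assumed_kwh", "note"])
        writer.writerow([
            datetime.now(timezone.utc).isoformat(),
            task_type,
            ASSUMED_KWH_PER_QUERY.get(task_type, 0.0),
            note,
        ])

log_query("image", "logo draft")
log_query("text", "email rewrite")
```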

Tools for Informed Decision-Making

New resources like EcoLogits, ML.ENERGY, and AI Energy Score attempt to estimate energy costs across models. Of course, these tools face data limitations, since AI companies rarely disclose full water or energy footprints. Importantly, though, they represent a starting point for greater transparency and can help us all do better.
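
EcoLogits, for example, is a Python library that instruments provider clients and attaches impact estimates to each response. The sketch below follows its documented usage pattern; treat the exact attribute names as assumptions to verify against the version you install.

```python
# Sketch of per-query impact estimation with EcoLogits, following its
# documented pattern; verify attribute names against your installed version.
from ecologits import EcoLogits
from openai import OpenAI

EcoLogits.init()  # instruments supported provider clients

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "One-sentence summary of PUE?"}],
)

# EcoLogits attaches an `impacts` object with energy and emissions estimates.
print(response.impacts.energy)  # estimated energy for this query (kWh)
print(response.impacts.gwp)     # estimated greenhouse-gas emissions (kgCO2eq)
```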

The Path Forward

The future of sustainable AI isn’t just about better algorithms; it’s about better habits. Before each query, ask yourself:

  • Does this task genuinely require AI assistance?
  • Am I using the most efficient tool available?
  • Could a simpler approach achieve the same result?

Efficiency and necessity aren’t the same thing. Building awareness into our daily use is the most immediate step we can take while waiting for providers to disclose more data and design more efficient systems. The choice isn’t between using AI and rejecting it; it’s about deciding when and how to use it responsibly, especially as the environmental costs keep mounting.


References

Adamska, M., Smirnova, D., Nasiri, H., Yu, Z., & Garraghan, P. (2025). Green prompting (Version 2). https://doi.org/10.48550/arXiv.2503.10666

Jegham, N., Abdelatti, M., Elmoubarki, L., & Hendawi, A. (2025). How hungry is AI? Benchmarking energy, water, and carbon footprint of LLM inference (Version 3). https://doi.org/10.48550/arXiv.2505.09598

Li, P., Yang, J., Islam, M. A., & Ren, S. (2025). Making AI less "thirsty": Uncovering and addressing the secret water footprint of AI (Version 5). https://doi.org/10.48550/arXiv.2304.03271

Luccioni, S., Jernite, Y., & Strubell, E. (2024). Power hungry processing: Watts driving the cost of AI deployment? (Version 3). https://doi.org/10.48550/arXiv.2311.16863

Vandenbergh, M. P., Thorpe, E. I., & Gilligan, J. M. (2025). The energy and environmental footprint of AI. Vanderbilt Law Research Paper, 25-11.