The green cost of AI: What designers must consider beyond the interface

As AI adoption skyrockets, so does its environmental impact, racking up massive carbon emissions, water use, and e-waste. This article examines the hidden costs behind large language models like ChatGPT, and what designers can do to reduce AI’s footprint. Learn how smarter design choices can lead to more sustainable AI futures.


Written by Sofiah Ridwan


The Hidden Cost of AI Innovation

As the use of artificial intelligence skyrockets across industries, from marketing and customer service to product development and healthcare, few stop to ask: What is the environmental impact of AI?

Spoiler: it’s significant—and growing fast.

Every interaction with AI models like ChatGPT, Midjourney, or Google’s Gemini consumes energy, water, and resources. While AI is marketed as a clean, digital solution, its infrastructure is carbon-intensive and resource-hungry.


 

The carbon footprint of AI: How much energy does AI use?

Training and running large AI models require enormous computing power. A study by Strubell et al. (2019) found that training a single Natural Language Processing (NLP) model with architecture search consumed 1,287 MWh of electricity and emitted approximately 552 metric tons of CO₂. To put that into perspective, that’s the same as fully charging 70 million smartphones at once.
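To see roughly where the 70-million-phone comparison comes from, here is a quick sanity check, assuming about 18.5 Wh for one full smartphone charge (an illustrative figure; the exact number varies by handset):

```python
# Sanity check on the smartphone comparison. The 18.5 Wh per full charge
# is an assumed typical value, not a measured one.

training_wh = 1_287 * 1_000_000   # 1,287 MWh expressed in watt-hours
wh_per_charge = 18.5              # assumed energy for one full charge

charges = training_wh / wh_per_charge
print(f"{charges / 1e6:.0f} million full charges")  # ≈ 70 million
```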

But models have grown since then. GPT-3 had 175 billion parameters, requiring far more compute than the models studied in 2019. GPT-4 is estimated to have used 10 to 100 times more training compute than GPT-3. In emissions terms, that’s like going from charging 70 million smartphones to charging up to 7 billion of them. And with GPT-5 expected to scale even further, we may soon be looking at energy use equivalent to charging the world’s entire smartphone population, many times over.

And this is just training. Over time, daily user interactions with AI account for even greater emissions.

A 2023 IMF report projected that AI-related emissions could reach 1.5 gigatons of CO₂ annually by 2030 if left unchecked. This is equivalent to charging nearly 36 billion smartphones in a year.


The AI water crisis: AI’s hidden thirst

AI systems are also remarkably water-intensive, largely due to the need to cool overheating data centres.

According to a 2023 study from UC Riverside, a typical ChatGPT session of 20–50 prompts consumes the equivalent of a 500ml bottle of water. That doesn’t seem too bad, right? Until you multiply it by the millions of people who talk to ChatGPT daily. Projections show that by 2027, AI systems may consume up to 6.6 billion cubic metres of water every year. That’s enough to sustain half of the UK for a whole year!


Even major tech companies have started disclosing massive water consumption spikes. Google reported a 20% increase in water use in 2022, driven by data centre demands linked to AI workloads.


What can designers actually do to reduce AI’s environmental impact?

AI feels invisible most of the time — just lines of code running in the background. But the truth is, every AI feature we recommend, design, or ship has a carbon cost. As designers, we’re not powerless bystanders in the climate conversation. We’re co-pilots.

So how can we be more intentional? Here are concrete ways designers can reduce AI’s impact, whether you’re working on consumer apps, enterprise tools, or internal platforms.

1. Design with purpose

AI is often treated like a glittery add-on. “Can we use AI here?” becomes the question, not “Should we?” Instead, ask:

  • Does this AI feature solve a real user pain point?

  • Would simpler logic (like if-else rules or filters) do the job?

  • Will this be used frequently enough to justify its carbon footprint?

💡 Try this: Add a “Why AI?” checklist to your design process to help product teams justify its inclusion.
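As a concrete illustration of the “simpler logic” question above, here is a minimal sketch, with all names and canned replies purely hypothetical, of keyword rules handling common cases before any model is ever considered:

```python
# Hypothetical rule-based "smart reply" that covers frequent requests
# without an AI call. Keywords and replies are illustrative only.

RULES = {
    "refund": "You can request a refund from Settings > Orders.",
    "password": "Use the 'Forgot password' link on the sign-in page.",
}

def suggest_reply(message: str):
    """Return a canned reply if a simple keyword rule matches, else None."""
    text = message.lower()
    for keyword, reply in RULES.items():
        if keyword in text:
            return reply
    return None  # only at this point would a team weigh an LLM fallback

print(suggest_reply("How do I reset my password?"))
```

If rules like these handle most traffic, the carbon-heavy model only ever sees the genuinely hard cases.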

2. Advocate for model efficiency early

Most designers don’t pick the model architecture, but we can shape how it’s used. Light, task-specific models are often enough for user needs. Start conversations like:

  • “Could we use a smaller, distilled model for this?”

  • “Is this model fine-tuned for our dataset, or are we brute-forcing a generic one?”

💡 Try this: During design reviews, ask for model size, frequency of use, and inference cost as part of your AI feature documentation.
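The documentation fields suggested above could be captured as plain data that a design review fills in; the feature name, model label, and numbers below are purely illustrative:

```python
# Illustrative "AI feature documentation" record. Every value here is a
# made-up placeholder a real team would replace with measured figures.

feature_doc = {
    "feature": "smart search suggestions",
    "model": "distilled-small (assumed name)",
    "parameters_millions": 60,        # vs. billions for a generic LLM
    "calls_per_user_per_day": 12,
    "avg_inference_cost_wh": 0.3,     # assumed estimate, not measured
}

# A rough daily energy figure per user, just to make the cost visible:
daily_wh = (feature_doc["calls_per_user_per_day"]
            * feature_doc["avg_inference_cost_wh"])
print(f"{daily_wh:.1f} Wh per user per day")
```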

3. Design for frequency & triggers

AI features that run every time a user opens an app or scrolls a page are energy hogs. Features that run once and cache results are much cleaner. Here are design decisions that make a difference:

  • Debounce auto-suggestions to reduce repeated calls.

  • Use AI on submit, not on every keystroke.

  • Cache AI-generated results (especially for content recommendations).

💡 Try this: Work with engineers to map where and how often AI gets triggered. Then optimise for the “green path.”
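The debounce and caching patterns above might look something like this minimal sketch (the function names and the 0.5-second threshold are assumptions, not any real product’s values):

```python
import time
from functools import lru_cache

# Two patterns in one sketch: a debounce guard that drops rapid repeat
# calls, and a cache so identical prompts never hit the model twice.

_last_call = 0.0

def debounced(min_interval: float = 0.5) -> bool:
    """Return True only if enough time has passed since the last call."""
    global _last_call
    now = time.monotonic()
    if now - _last_call < min_interval:
        return False
    _last_call = now
    return True

@lru_cache(maxsize=1024)
def suggest(query: str) -> str:
    # stand-in for an expensive model call
    return f"results for {query!r}"

if debounced():
    print(suggest("green design"))   # computed once...
print(suggest("green design"))       # ...served from cache afterwards
```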

For example, Spotify engineers and product teams analyse when and how often AI-powered recommendation algorithms run to serve personalised playlists. They’ve optimised their systems to:

  • Cache popular recommendations

  • Batch AI computations during off-peak hours

  • Reduce repeated AI calls when users skip tracks rapidly

This reduces unnecessary compute and cuts down on energy use in their backend systems. Spotify’s precise implementation details are proprietary and remain undisclosed.

4. Surface the invisible cost

Just like we normalise privacy notices or battery usage indicators, we can help normalise the environmental impact of digital actions. Some ideas include:

  • “Eco-friendly AI” indicators for low-carbon experiences.

  • Optional toggles for users to choose lightweight vs. richer AI modes.

  • Microcopy that explains when and why AI is being used. 

💡 Try this: Include micro-interactions or affordances that show AI is working intentionally, not automatically.

For example, Figma’s “AI Remove Background” feature: when removing a background, Figma shows a smooth progress animation on the image that gradually fades into the AI-edited result.

 

5. Join (or start) the sustainability conversation

Designers are community builders. The more we talk openly about the invisible energy costs of AI, the more we empower others to think beyond pixels. 

💡 Try this:

  • Start a Slack/Notion thread on “sustainable AI practices” at work

  • Share resources, like those from the Green Software Foundation.

  • Propose a lightning talk at your next design crit or all-hands, or simply share your thoughts on sustainable AI practices on social media.


Surprising facts about greener AI

Most people think AI’s environmental cost is unavoidable, but smarter, more sustainable practices already exist. Here are a few:

  1. Did you know? AI can run cleaner at certain times of the day.

    Companies like Microsoft are experimenting with carbon-aware computing, scheduling AI workloads during periods when renewable energy (like solar) is strongest.

    It’s like doing your laundry during off-peak times, when electricity is cheapest—but for AI training.

    💡 Designer tip: Could your product delay or batch AI tasks to align with greener energy windows?

  2. Did you know? AI doesn’t always need the cloud.

    On-device AI (like Apple’s Siri or Google’s Gboard) cuts out the energy waste of constantly sending data back and forth to servers. This not only reduces emissions but also improves privacy, server load, and speed.

    💡 Designer tip: Could your app handle simple AI features locally instead of calling the cloud every time?
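Taken together, the two tips above can be sketched as one toy routing rule. The green-window hours and task names below are made up for illustration; a real system would use live grid carbon-intensity data rather than a fixed schedule:

```python
# Toy sketch combining carbon-aware scheduling with on-device routing.
# GREEN_WINDOW and SIMPLE_TASKS are invented values for illustration.

GREEN_WINDOW = range(10, 16)   # assumed midday solar peak
SIMPLE_TASKS = {"autocorrect", "next-word", "emoji-suggest"}

def place_workload(task: str, hour: int) -> str:
    """Run lightweight tasks on-device; defer heavy cloud jobs to a green window."""
    if task in SIMPLE_TASKS:
        return "on-device now"
    if hour in GREEN_WINDOW:
        return "cloud now"
    return f"cloud, deferred to {GREEN_WINDOW.start}:00"

print(place_workload("autocorrect", hour=22))       # on-device now
print(place_workload("batch-summarise", hour=12))   # cloud now
print(place_workload("batch-summarise", hour=22))   # cloud, deferred to 10:00
```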


Small actions, big ripple effect

Designing responsibly doesn’t mean avoiding AI. It means choosing it consciously, asking smarter questions, and designing with context — not just convenience. 

We won’t solve the climate crisis with button placements, but we can shape a tech industry that cares about the long tail of what we build.

And that starts with us.

 
 
 