Turning thoughts into reality: Can AI truly capture your vision?

AI has quickly become a creative companion for many of us: helping write captions, summarise research, and even generate design concepts in seconds. It’s fast, accessible, and often feels like magic. But as I’ve learned through my own design experience, while AI can assist in creating, it doesn’t always understand what we envision.

Written by Hui Qi

As designers, our decisions are shaped by emotion and context — how we want something to feel and what we want it to communicate. AI doesn’t truly grasp that. It predicts patterns, not intentions. 

With AI becoming a bigger part of our creative process, this difference feels increasingly important to reflect on. 

When AI met my Deepavali design

Recently, I was tasked with creating a Deepavali social media post for my company. I wanted the artwork to feel festive yet simple — something warm, symbolic, and meaningful. 

I started the process by asking ChatGPT to help me brainstorm ideas. It gave me plenty of suggestions: vibrant rangoli patterns, glowing lamps, colourful gradients. They were all interesting, but one idea stood out to me: a negative space illustration. It felt clean and thoughtful, something that aligned with how I envisioned the design.

So, I decided to bring that idea to life using AI image-generation tools. But the results weren’t what I imagined. The designs came out overly complicated, packed with unnecessary details, or missing the subtlety I was looking for.  

Eventually, I returned to Figma and created the design myself. It took longer, but the process felt intentional. Every decision came from understanding what the design meant, not just how it looked. 

Generating Deepavali designs using AI models like Adobe Firefly and Google Imagen

A second try: animating with AI

Once the static artwork was completed, I decided to give AI another opportunity — this time to animate the design. I wanted just a gentle movement in the flame to bring a sense of warmth and life to the piece. 

However, AI had other ideas. Instead of subtly animating the flame, it animated almost everything else, adding unnecessary motion and overcomplicating something meant to feel calm and reflective. 

Some examples of AI adding unnecessary complexity to the animation

That experience taught me that AI interprets instructions in a very literal way. It doesn’t understand nuance or intent; instead, it analyses patterns, predicts outcomes, and generates what it thinks you might want based on probability. Where humans read between the lines, AI reads data. It can follow directions perfectly and still miss the meaning behind them, because it doesn’t create with feeling or purpose, only logic.

Why AI feels different depending on the role

This experience also made me reflect on how AI behaves differently across creative roles. 

For UX designers, AI often acts as a valuable thinking partner, helping to summarise research, identify user needs, or generate user journeys. These are logical, structured tasks that align well with how AI operates. 

But for visual designers, the process is more intuitive. Our work involves emotion, composition, and cultural nuance — aspects that can’t be quantified. AI can replicate aesthetics, but it struggles to understand the why behind them. It can generate visuals that look polished, but they often lack soul. 


Final thoughts

While AI can be a helpful assistant, it’s still far from understanding the heart of design. Our AI tools can follow instructions and reproduce patterns, but they don’t share our intent or emotion.

AI may help you create faster, but not necessarily better — because good design isn’t just about what looks right, but what feels right. 

So, can AI truly capture your vision? 
Maybe not. But in revealing what AI can’t do, we are reminded of why the human touch still matters. 
