Original Research
Using What We Don't Trust
The State of AI in Design, 2025: In Designers' Own Words
Last Updated: December 20, 2025
There's a gap between what we're saying about AI and what we're feeling about it.
Scroll LinkedIn and you'll find confident narratives: productivity soaring, transformation underway, future belonging to those who embrace it. But talk to designers privately, in Slack channels, Reddit threads, quiet conversations after work, and a different picture emerges. One marked by anxiety, ambivalence, and questions without clean answers.
I wanted to understand that gap.
So I went looking. Not for hot takes, but for what designers and researchers are actually experiencing, in their own words. I gathered data from User Interviews, Figma, Maze, Nielsen Norman Group, UXPA, Roy Morgan, and dozens of community discussions. Over 80 sources. Nearly 100 direct quotes.
What I found wasn't simple.
It's a profession in genuine tension. Using tools they don't trust. Watching the apprenticeship pipeline collapse. Gaining efficiency while losing meaning. Questioning not just their jobs but their purpose.
I should be clear: I'm not a neutral observer. I run a practice focused on human-centred AI design. I've had my own gut aches over the past three years. Questioned my relevance, my direction, my craft. I've also come out the other side more creatively empowered than at any point in my twelve-year career.
Both things are true.
If you're feeling the tension, between using AI and not trusting it, between recognising opportunity and fearing what's lost, you're not alone.
Before We Go Further
I want to name something.
This isn't one experience.
A junior designer in Melbourne staring at an "empty void" of opportunities is holding something different than a design director in New York navigating boardroom pressure. A UX researcher watching 91% of their peers worry about hallucinations is in a different place than a visual designer watching their craft approximated in seconds.
The research surfaced these differences. I want to let some of them speak before we move into the shared themes.
The Voices
"I graduated with honours. I've applied to over 200 jobs. I'm starting to wonder if I chose the wrong career at the wrong time."
– Junior Designer
"I'm a Senior Designer, unsure of my next steps. With the rise of AI absolutely everywhere now I'm really scared about my future. I don't have a contingency plan in place, so I'm wondering where to begin."
– Senior UX Designer, Reddit
"AI can now do tasks that juniors used to cut their teeth on. Background removal, fast iterations, even producing bulk screens. And in 5-10 years, we'll face a massive gap with no healthy pipeline of talent to fill it."
– Design Leader
"AI makes me feel like a fool. When I look at AI-generated art, I'm reminded of the countless hours I've dedicated to freelance work as a single parent just to make ends meet."
– Graphic Designer
"Cheapening of artwork and higher amounts of AI slop."
– Creative Professional, Australia
"It undermines the value of human labour."
– Creative Professional, Australia
24% of workers aged 18-34 rate their job-loss concern at 8 or higher out of 10.
10% of workers 55+ share the same concern.
62,000+ tech layoffs across the Americas in H1 2025.
But the numbers aren't the point. The point is: where you sit shapes how this lands.
The themes that follow are shared. The weight of them is not.
I wanted you to be able to find yourself in this before we go further.
1. The Adoption Paradox
The Finding
AI adoption among designers and researchers has surged. 80% of UX researchers now use AI tools, up 24 percentage points year-over-year. Yet trust hasn't followed. Among those using it, 91% worry about accuracy and hallucinations. Only 32% say they can rely on the output.
We are adopting tools we do not believe in.
The Deeper Context
This isn't enthusiastic adoption, it's compelled usage. The pressure is largely external: organisations want more output, faster turnaround, competitive positioning. Designers and researchers are caught between institutional demands and professional instinct.
The gap is particularly acute for researchers. User Interviews' 2025 State of Research report found that while AI usage has become near-universal, sentiment remains net negative. 41% negative versus 32% positive. The very professionals trained to understand human experience are being asked to trust systems that flatten it.
The Voices
"2025 is clearly the year that AI ate common sense."
– Director of Insights Operations
"AI has the potential to be the researcher's best friend, by doing all the heavy lifting associated with analysis, but it also has the potential to cause unimaginable damage."
– UX Researcher
"I don't know if these insights are real or if I'm just being told what patterns look like."
– Research Participant, User Interviews Survey
"We're all using it. Nobody's saying they trust it."
– UX Designer, Reddit
What Remains Unresolved
If 91% don't trust the outputs but 80% are using them anyway, what does 'adoption' actually mean?
Are we building expertise, or building dependency on systems we don't believe in?
And what does compelled usage, without conviction, do to a profession built on rigour and integrity?
💭 Riley's Thoughts
Here's what I keep sitting with: the distrust you feel using these tools? That's exactly what your users feel when they encounter your AI product.
So this paradox isn't just an industry observation. It's a mirror.
There are specific signals (microinteractions, microcopy, onboarding mechanics) that build trust. And specific ways to break it. When trust breaks in an AI product, users churn at twice the rate of non-AI products.
The tension between using something and not trusting it, that's not unique to designers. It's the lived experience of everyone encountering AI right now. Including your users.
2. The Pipeline Severance
The Finding
Entry-level design hiring has collapsed by 50% since 2019. In Australia, the market is described as 'especially fragile.' Meanwhile, 35% of UX teams reported staff losses in 2024, double the rate from 2022.
The tasks that once trained junior designers (transcription, basic wireframing, initial synthesis) are increasingly handed to AI. The apprenticeship model that built expertise through iteration is quietly breaking.
We are automating the runway that taught people to fly.
The Deeper Context
This isn't simply an economic contraction. It's a structural shift in how expertise develops.
Design and research skills have traditionally been built through supervised repetition, the "grunt work" that, while tedious, taught pattern recognition, quality judgment, and contextual sensitivity. Moderating your fiftieth interview teaches you things your first ten couldn't. Iterating through dozens of wireframe variations builds intuition that no course delivers.
When AI absorbs these tasks, the efficiency gain is real. But so is the loss. Juniors entering the field now face what one design educator calls an "empty void" of opportunities, roles that no longer exist, or exist in forms that skip the foundational learning entirely.
The consequences compound. Without hands-on experience, emerging designers lack the tacit knowledge that allows them to recognise when AI outputs are wrong, superficial, or subtly biased. They haven't built the 20-30% of human value-add that senior practitioners bring, the part AI cannot replicate.
The Voices
"There is an 'empty void' of opportunities for junior designers."
– Design Educator
"AI can now do tasks that juniors used to cut their teeth on... in 5-10 years, we'll face a massive gap."
– Design Leader
"Entry level hiring is cooked. The ladder doesn't exist anymore."
– UX Designer, Reddit
"Learning design through tools instead of people can shortcut the slow, formative work that builds creative muscle. Without exposure to the subtle cues of live collaboration, how a senior designer navigates client pushback, how a project pivots midstream or when to break the rules, new designers risk missing the soft skills required for real-world effectiveness."
– It's Nice That
"We're not just losing jobs. We're losing the mechanism that creates expertise."
– Research Leader
What Remains Unresolved
If the tasks that built expertise are now automated, how do emerging practitioners develop the judgment AI cannot provide?
Are we creating a generation of designers who can prompt but cannot evaluate?
And who bears responsibility for rebuilding the apprenticeship model: individuals, organisations, or the industry itself?
💭 Riley's Thoughts
Yes, the pipeline is breaking. That's real, and it's painful.
But here's the conversation I think we're missing.
Those young people, the ones struggling to find entry-level roles, they're digital natives. They've grown up with digital tools as an extension of themselves. And they're coming out of education without all the 'this is how we've always done it' baggage that we carry. They don't have to unlearn anything.
That's not a weakness. That's an advantage.
What if we stopped seeing junior hires as charity, or as overhead to be optimised away, and started seeing them as strategic capability? A senior with deep expertise paired with a junior who thinks AI-first? That's not mentorship as a burden. That's a powerhouse.
I've just returned from six years in the EU, where internship culture is strong. We don't have that tradition in Australian design. And right now, with junior roles scarce and senior teams stretched, it's an untapped opportunity.
Fresh minds plus seasoned expertise. That's a strategy that could build AI-augmented capability from the ground up, sustainably, not by automating the apprenticeship away, but by redesigning it.
3. The Productivity Paradox
The Finding
89% of designers report that AI has improved their workflow. Efficiency gains are real and measurable. Tasks that once took hours now take minutes. The promise of AI-augmented productivity is not a myth.
And yet. Burnout persists. Expectations escalate. The time saved is immediately absorbed by demands for more output, faster turnaround, additional iterations. The efficiency gain exists, but designers aren't experiencing it as relief.
We're producing more. We're not reflecting more.
The Deeper Context
The productivity promise of AI contained an implicit hope: automate the tedious, reclaim time for thinking. Free designers from grunt work so they can focus on the creative and strategic work that matters.
That's not what's happening.
Instead, efficiency gains are being captured by the organisation, not the individual. Faster output creates an appetite for more output. What was once a week's work becomes a day's expectation. The ceiling rises to meet the new floor.
The result is a curious inversion. Designers report feeling more productive and more depleted. More output and less meaning. The tools work. The humans are wearing down.
The Voices
"AI tools were meant to automate the boring bits... Instead, creatives speak of endless iterations, escalating client demands and entirely new categories of digital drudgery."
– It's Nice That
"Whoever thought AI would free creatives to be more creative wasn't paying attention; it pushes us to produce more, not reflect more."
– Design Director
"I'm outputting more than ever. I've never felt less creative."
– UX Designer, Reddit
"The time I save with AI gets immediately filled with more requests. There's no net gain, just more volume at the same exhaustion level."
– Product Designer, LinkedIn
"We used to have time to think. Now we have time to produce. Those aren't the same thing."
– Research Lead
What Remains Unresolved
If efficiency gains are absorbed by escalating demands, what was actually gained?
Is AI amplifying our capacity or merely accelerating the treadmill?
And how do we design workflows, and AI products, that protect space for the thinking that quality requires?
💭 Riley's Thoughts
I want to be honest: this isn't all doom and gloom.
I feel more creatively empowered as a solo operator right now, with three teams of AI agents that extend my capabilities into areas where I've never had skills or expertise. The opportunity is real. And from my conversations with design leaders and practitioners, there's an enormous gap between those who've figured out how to leverage AI well and those just coming to it now.
But efficiency without quality is just faster mediocrity.
The only way to get quality is through that 20-30% of human value-add, the expertise, the wisdom, the creative judgment. Without it, you get homogenised slop. Work that sounds like it's saying something until you realise it isn't.
Here's my provocation: the pause is where the power is.
Not everything should be faster. Strategic friction. I know that phrase breaks the brain of every designer who's spent a decade removing friction, but hear me out.
Creative thinking happens in pauses. Critical thinking. Ethical judgment. The moment where you look at AI output and say, 'No, that's not quite right, here's why.'
That's not inefficient. That's where the value lives. Because if it were truly one-and-done, if the button push was all that mattered, you'd already be out of work.
4. The Homogenisation Fear
The Finding
When asked whether AI improves the quality of their work, designers are notably less convinced than their developer counterparts, 54% versus 68%. The gap reveals something beyond mere scepticism about tooling. It points to a deeper anxiety about what happens when creative work passes through systems trained on averages.
The fear has a name now: homogenisation. The flattening of creative output toward a competent, frictionless, forgettable middle.
When everyone draws from the same well, everything starts to taste the same.
The Deeper Context
AI systems are trained on vast corpuses of existing work. They learn patterns, absorb conventions, and generate outputs that reflect the statistical centre of what they've consumed. This is a feature, not a bug: it's how the technology works.
But for creative disciplines, the statistical centre is precisely what you're trying to escape.
Great design has always been about purposeful deviation, knowing the rules well enough to break them meaningfully. It's the unexpected word choice, the counterintuitive layout, the friction that makes you pause. Distinctiveness lives at the edges, not the middle.
When AI handles the first draft, the base layer, the "starting point," it anchors the work toward convention. The designer can push against this, but pushing against a current takes more energy than swimming in open water. The gravitational pull toward average is subtle but persistent.
The Voices
"When everybody uses the same few 'smart' tools to craft their digital products, it becomes hard to differentiate one from another. Although AI-built user interfaces are usually clean and frictionless, they often lack personality and soul."
– Miyagami Design Agency
"AI-generated images are too homogenous in style. This visual 'sameness' carries over into industrial design, fashion, and game design as well."
– BMW Designworks
"It's all starting to look the same. Clean, competent, completely forgettable."
– Creative Director, LinkedIn
"I can spot AI-assisted work now. Not because it's bad, because it's indistinct. It has no fingerprint."
– Senior Designer
"The first draft is faster. But I spend longer trying to make it not sound like everything else."
– Content Designer
What Remains Unresolved
If AI trends toward the average, and we increasingly rely on AI for first drafts and base layers, are we slowly eroding the conditions for distinctiveness?
Can tools trained on convention ever support genuine originality?
And what happens to creative culture when the path of least resistance leads everyone to the same place?
💭 Riley's Thoughts
This is the thing I keep coming back to: AI trends toward the middle.
If you're not great at something, AI can lift you up. It can make a weak first draft sound competent. That's real, and it's valuable. But it's also lowering the ceiling.
When you run your best thinking through a system trained on averages, it sands off the edges. The idiosyncratic phrasing. The unexpected choice. The thing that made it yours. What comes back is smoother, and flatter.
Without that 20-30% of human value-add, the expertise, the taste, the willingness to say 'no, that's not quite right', you get homogenised slop. Work that looks professional but isn't actually connecting.
And here's what worries me, this compounds. As more AI-assisted work enters the world, it becomes the new training data. The middle learns from the middle. The edges get harder to find.
Your job now is to protect the edges. To be the one who says, 'This is too smooth, where's the texture?'
5. The Identity Crisis
The Finding
In online design communities, 75% of discussions about AI mention job security or role evolution. Among workers aged 18-34, nearly a quarter rated their job-loss concern at 8 or higher out of 10.
Fear isn't abstract. It's personal, present, and often unspoken in professional settings, even as it dominates private conversations.
Designers are questioning not just their jobs, but their purpose.
The Deeper Context
Design has always carried more identity weight than most professions. Designers don't just do design, they are designers. The craft is woven into self-concept in ways that accounting or project management rarely match.
This makes the current moment particularly destabilising.
When AI can generate a wireframe, write a content strategy, or produce a visual concept in seconds, it doesn't just threaten tasks. It threatens the narrative designers have built about their own value. The internal story that said "I'm the one who can do this" meets a machine that can produce something close enough, fast enough, to make the claim feel precarious.
The response is often private shame. Designers report feeling foolish for struggling with tools others seem to master. They question whether their hard-won skills still matter. They wonder if they've chosen the wrong profession at the wrong moment in history.
And because this shame is private, it compounds. Everyone assumes everyone else has figured it out. The isolation reinforces the crisis.
The Voices
"AI makes me feel like a fool. When I look at AI-generated art, I'm reminded of the countless hours I've dedicated to freelance work as a single parent just to make ends meet."
– Graphic Designer
"I've spent 15 years building expertise that a tool can now approximate in 30 seconds. What does that make me?"
– Senior UX Designer, Reddit
"I sense a shift, like I'm on a path to stagnation, potentially fading into obscurity."
– Visual Designer
"It's not that I can't use the tools. It's that I don't know who I am if the tools can do what I do."
– Product Designer, LinkedIn
"Everyone's posting about how AI makes them more productive. I just feel more replaceable."
– Junior Designer
What Remains Unresolved
If design identity is threatened by AI capability, how do practitioners rebuild a sense of purpose?
Is this a temporary crisis of adjustment, or a fundamental renegotiation of what design means?
And how do we create space for honest conversation about professional fear in cultures that reward performance of confidence?
💭 Riley's Thoughts
I feel this one in my gut.
Most designers I know have a lot of identity wrapped up in what we do. If I had a drug of choice, it would be creative problem-solving. It's part of who I am, part of who I want to be.
So I want to tell you something honestly: I've been doing this for a bloody long time. And I've never felt more creatively empowered than in the last three years.
AI isn't killing your craft. It's offering you a new medium for it.
I build my own tools now, not because I suddenly became a coder, but because I have ideas about problems I want to solve, I know how to think through a problem and find the best information, and I know how to get AI to build those tools with me. And not to an average level, but based on best-in-class examples. That's not replacing my creativity. It's extending it.
But I had to go through the identity crisis to get here. I had my own months of 'what does this mean for me?' I grappled with it. And I came out the other side loving what I do more than I have in years.
Here's something I want you to do.
Think of a piece of work you're genuinely proud of. Something that got meaningful feedback, or something you were just quietly, intrinsically proud of. Now unpack it. What context did you bring that no brief contained? What micro-decisions did you make along the way? What did you notice that someone else might have missed? What creative leap came from your particular way of seeing?
All of those things, every single one, are things only YOU could bring.
That's not abstract value. That's the living, breathing embodiment of what AI cannot replicate.
Don't let that be sanded off. Don't let efficiency flatten it. The uniqueness you bring to a specific context, that's how we distinguish ourselves in an AI-driven world. Don't lose yourself.
6. The Human Remainder
The Finding
Despite the anxiety, a counter-signal persists: 52% of designers believe design has become more important for AI-powered products, not less. Research into human-AI collaboration found that combined outputs score 37% higher on originality than AI alone.
The data suggests something the fear obscures: AI's limitations create irreplaceable space for human contribution.
What AI cannot do is precisely where human value lives.
The Deeper Context
AI systems are prediction engines trained on patterns in existing data. They excel at interpolation, finding the space between known points. They struggle with extrapolation, moving beyond what exists toward what could.
More fundamentally, they lack context.
Not superficial context; AI can process vast information about a domain. But embodied context. Lived experience. The tacit knowledge that comes from being a person in the world, navigating relationships, feeling consequences, holding contradictory truths.
When a researcher observes an interview subject pause before answering, look away, shift in their chair, that's data. When they sense what the pause means in this specific context, for this specific person, in relation to everything else they've learned, that's understanding. AI can record the pause. Only humans can interpret its meaning.
This limitation isn't temporary. It's structural. AI doesn't have a body. It hasn't lived. It cannot know what it means to be afraid, to love someone, to face mortality, to feel the weight of an ethical choice. It can process descriptions of these experiences. It cannot have them.
The Voices
"What AI cannot replace is the human ability to think strategically, to empathise with consumers, and to create something truly original."
– David Droga, Creative Chairman, Accenture Song
"AI can read the data. But it can't read the room."
– Head of Design
"The value isn't in producing outputs. It's in knowing which outputs matter."
– Jeff Gothelf, Author/Product Coach
"AI gives me answers. It can't tell me if I'm asking the right questions."
– Senior Researcher
"We're not being replaced by AI. We're being challenged to prove what only we can do."
– Design Director
What Remains Unresolved
If human value lives in context, judgment, and lived experience, how do we cultivate these in a world accelerating toward automation?
How do we protect the pause for reflection when everything demands speed?
And how do we help organisations understand that the irreplaceable human contribution isn't overhead to be optimised, it's the source of quality itself?
💭 Riley's Thoughts
This is where I want to speak directly to my fellow human-centred designers.
We have built our careers on bringing empathy to understanding, the lived experience, the needs of customers, and responding with designs that invent the future. That capacity isn't diminished by AI. It's more vital than ever.
Because here's what we need to understand: we're not designing AI systems for the systems themselves. We're designing for humans to interact with AI. We're designing a new form of intelligence to collaborate with our intelligence.
That makes this fundamentally a relationship-building challenge.
AI operates on one layer: logic, reasoning, pattern matching. But when humans make decisions, we're processing on multiple layers simultaneously. There's the logical layer, where AI participates. But there's also emotional processing: how will this affect others, how will I be perceived? And there's fear, the layer that enables or shuts down our ability to think creatively, to experiment, to see strategically.
You've heard the fear throughout this entire piece. It's pervasive. And fear functionally impairs creative thinking. It closes us down.
Here's something else that's changed. Every time you use a generative AI system, even with the same prompt, you get a different result. We used to design and control user flows. We mapped them out. But now, the flow happens during the experience.
So the goal now needs to be: designing the best conditions for healthy collaboration. The mechanics that help humans and AI partner well together. The trust signals, the friction points, the moments of human agency.
I want to activate human-centred designers more than any other professionals to not lose that soul we bring to our work. Because it is more important than ever.
Don't give up your empathy. Don't give up your contextual understanding. Don't give up your ethical judgment. Those aren't soft skills to be optimised away, they're the entire point. We're building humanity's future relationship with intelligence that may soon surpass our own. And you, the person trained to centre humans in everything you design, are exactly who should be shaping that relationship.
What Remains
I started this research wanting to understand the gap between what we say about AI and what we feel about it.
What I found was a profession holding multiple truths at once.
We're adopting tools we don't trust.
We're gaining efficiency while losing meaning.
We're watching the path that built expertise disappear.
We're questioning not just our jobs, but our purpose.
And yet, we're also discovering that what AI cannot do is precisely where our value lives.
I haven't tried to resolve these tensions. I don't think they can be resolved, not yet, and maybe not by any one person. They're the lived reality of a profession in transformation.
But I do believe that naming what's unnamed helps. That hearing your own experience reflected in others' words reduces the isolation. That honest conversation, even when it doesn't provide answers, is more useful than confident prescriptions that paper over complexity.
If this piece has done its job, you feel a little less alone. A little more seen. And perhaps a little clearer about where your own thinking sits amid all this noise.
The questions remain open.
What does meaningful adoption look like when trust lags behind usage?
How do we rebuild the apprenticeship model for a world where AI handles the grunt work?
Where does efficiency end and meaning begin?
How do we protect distinctiveness when the tools trend toward average?
What does it mean to be a designer when machines can design?
And who is shaping humanity's relationship with AI, if not us?
A Note on Methodology
This article synthesises findings from over 80 sources, including industry reports from User Interviews, Figma, Maze, Nielsen Norman Group, UXPA, Roy Morgan, and Deutsche Bank Research; academic research; and community discussions from Reddit, LinkedIn, and design forums. Nearly 100 direct quotes were gathered and categorised by theme, sentiment, and source type.
Community voices have been de-identified to protect contributors who share vulnerable perspectives. Named attributions are reserved for public figures making statements on their own platforms or in public interviews.
This research was conducted in late 2025. The landscape is evolving rapidly; these findings represent a snapshot of sentiment at a particular moment in a fast-moving transformation.
Riley Coleman is the founder of AI Flywheel, a practice focused on human-centred AI design. They work with design teams and organisations to build AI experiences that earn and maintain user trust, while helping designers navigate their own relationship with these tools.
Frequently Asked Questions
What is this research about?
Original research exploring the gap between what designers say about AI and what they feel, synthesising 80+ sources, nearly 100 direct designer quotes, and six key findings on trust, fear, adoption, and the human remainder in AI design.

What are the key findings?
Six key findings: (1) Designers use AI tools they don't trust: 80% adoption, but 91% worry about accuracy. (2) Entry-level hiring collapsed 50% since 2019. (3) Efficiency gains don't reduce burnout. (4) Homogenisation fear is widespread. (5) 75% of community discussions mention job security. (6) Human value lives in context, judgment, and lived experience.

What sources does the research draw on?
The research synthesises insights from 80+ sources including User Interviews, Figma, Maze, Nielsen Norman Group, UXPA, and Roy Morgan. It includes nearly 100 direct designer quotes and draws on AI Flywheel's data from 304+ designers across 7 cohorts.

What is the "human remainder"?
The human remainder is what AI cannot replicate: embodied context, lived experience, ethical judgment, emotional understanding, and the ability to interpret meaning beyond data. Research shows human-AI combined outputs score 37% higher on originality than AI alone.

Why are designers adopting AI tools they don't trust?
Adoption is largely compelled, not enthusiastic. 80% of UX researchers now use AI tools, but 91% worry about accuracy. The pressure comes from organisations wanting faster output and competitive positioning; designers are caught between institutional demands and professional instinct.

What is happening to junior designers?
Entry-level design hiring has collapsed by 50% since 2019. Tasks that trained juniors (transcription, wireframing, synthesis) are now handled by AI. This breaks the apprenticeship model that built expertise through supervised repetition.

What is the homogenisation fear?
Designers fear AI flattens creative output toward 'competent, frictionless, forgettable' work. AI systems trained on averages pull work toward convention. Only 54% of designers believe AI improves quality, compared to 68% of developers.

Who conducted this research?
Riley Coleman, founder of AI Flywheel, conducted this research. They're transparent about not being a neutral observer: they run a practice focused on human-centred AI design and have navigated their own creative relationship with AI over 12 years in the industry.

Can I cite this research?
Yes. Please cite as: Coleman, R. (2025). 'Using What We Don't Trust: The State of AI in Design, 2025.' AI Flywheel. Available at: ai-flywheel.com/article/state-of-ai-design-2025

What methodology was used?
Mixed methods: secondary research synthesis from 80+ industry sources, primary observation from 7 cohorts (304+ designers), community discussions from Reddit and LinkedIn, and direct designer interviews. Nearly 100 quotes were gathered and categorised by theme and sentiment.
Want to go deeper?
Join the AI Flywheel community, where designers navigate AI together, with honesty and support.
Join the Community