
Designing for sceptical adoption: How user agency drove AI acceptance

Company name confidential

Read time: 9 minutes. Overview: 2 minutes.

A product transformation story: from optimisation brief to strategic pivot, after discovering fundamental challenges that reshaped direction

AI/ML Change management Design sprint Prototyping Usability Strategic UX User research Stakeholder management

Overview

Impact

  • Prevented years of sunk cost in an unviable partnership
  • Informed board-level strategy with rigorous, repeatable research evidence
  • Elevated UX research from tactical optimisation to a driver of business transformation

Reflection

This case demonstrated that when systemic architecture and identity concerns block adoption, no amount of UI polish will succeed. UX leadership requires the courage to deliver uncomfortable truths and the rigour to back them with data, turning research into boardroom strategy.

Context

This work is confidential, so for the purposes of this case study I will refer to my industry-leading client as 'IndustryCorp' and their AI startup partner as 'TechStartup'.

Client demand for professional services was rising, but the number of professionals entering the industry was not, and the resulting shortage widened over time into a critical capacity gap. To address this challenge, IndustryCorp made a significant investment in TechStartup. In industries where highly educated professionals serve growing client bases, the question becomes: can AI help professionals serve more clients without compromising quality?

Challenge

How do you convince expert professionals to trust AI with their most important decisions when they view it as a direct threat to their expertise and client relationships? These professionals needed to double their client capacity to meet growing demand, yet after 3 months of use, System Usability Scale (SUS) scores had declined by 14 points: clear evidence of growing user frustration with fragmented workflows.

Success criteria

  • Measurable increase in client capacity without quality degradation
  • Professional acceptance and advocacy for AI tools
  • Maintained client satisfaction and relationships
  • Evidence-based validation of AI decision quality

Discovery

What began as a brief to optimise an existing AI solution revealed fundamental product strategy problems:

  • AI acceptance was lower than expected
  • The solution depended on software from both companies being a visible part of the user flow
  • TechStartup's user interface was often incomprehensible to users

6 professionals interviewed
7 client sessions observed
1,440+ sessions of data captured
1 major business shake-up

My role

As UX lead for discovery, research and product shaping, I was engaged by IndustryCorp to:

  • consult internal stakeholders
  • evaluate the user experience of the existing technical implementation
  • identify barriers to adoption
  • conduct qualitative analysis and team mentoring
  • train staff in research planning, facilitation and synthesis
  • facilitate a design sprint
  • prototype the solution and run usability testing
  • shape the product design based on user research insights.

Research

Critical research insights

Adoption enablers

AI outcomes proven

Equal or superior accuracy in client testing, with less than 5% of AI outputs rated less effective than the professional's.

Data advantage realised

The AI accessed broader datasets than any individual professional could encounter in a career.

Revenue potential confirmed

Efficiency gains enabled serving twice as many clients.

Adoption barriers

My research began with the assumption that we had a fundamentally sound product requiring refinement and polish. However, evaluation revealed that the challenges weren't surface-level optimisation issues; they were fundamental problems requiring strategic consideration.

Identity threat

Highly educated professionals at the top of their field saw AI as a challenge to their expertise, prestige, and professional worth. They couldn't accept that AI could make better decisions than they could, feared losing the human connection with clients, and ultimately feared the loss of their industry as they knew it.

Fragmented experience

Even professionals open to AI became frustrated by the disjointed workflow. To complete tasks, users had to go back and forth between IndustryCorp's industry-leading software and TechStartup's AI software, creating cognitive overload and workflow interruption.

User interface problems

On top of the fragmented workflow, TechStartup's interface itself fell short of what these professionals expected:

  • Unclear progress indicators
  • Unclear functionality
  • Unprofessional interface compared to industry standards

The strategic revelation

What seemed like an optimisation project revealed the need to rethink the product strategy completely. AI resistance was still a core issue, but workflow fragmentation and TechStartup's interface were even bigger barriers to adoption of the technical implementation.

Core design principles extracted from research and stakeholder feedback

Through careful analysis and affinity mapping, we identified that professionals needed:

  1. agency in decision-making
  2. transparent comparison between AI and human judgement
  3. evidence of superior outcomes through real-world validation
  4. an integrated workflow that didn't fragment their process
  5. seamless tool switching or elimination of platform juggling.

Stakeholder response

They took the news pretty hard. A business agreement required that both applications remain visible in the process; the AI couldn't be used in the background.

The directive for the design sprint was to do our best without fundamentally changing the set-up.

Research methodology – how I obtained those insights

Methods

  • Stakeholder consultation
  • User interviews
  • Professional and client observations
  • Usability testing
  • System Usability Scale (SUS) surveys (a scoring sketch follows this list)
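
Since SUS anchors the quantitative story in this study, a minimal scoring sketch may help readers unfamiliar with the instrument. The formula below is the standard SUS calculation; the function name and sample responses are illustrative, not data from this study.

```python
def sus_score(responses: list[int]) -> float:
    """Convert ten 5-point Likert responses (1-5) into a 0-100 SUS score.

    Odd-numbered items are positively worded and contribute (response - 1);
    even-numbered items are negatively worded and contribute (5 - response).
    The summed contributions (0-40) are scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 holds item 1 (odd-numbered)
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Illustrative respondents only, not study data:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
print(sus_score([4, 3, 3, 3, 3, 3, 4, 3, 3, 3]))  # 55.0
```

Each individual questionnaire yields a multiple of 2.5, so a movement like the 14-point decline reported above emerges from averaging scores across respondents and measurement points.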

Key areas of inquiry

  • Current process mapping (steps, time, pain points)
  • AI decision-making acceptance and resistance (measured through structured questions about trial satisfaction, feelings about AI, process changes, scale-up intentions, and peer recommendation likelihood)
  • New process testing and validation
  • Future state visioning and ideal workflows

Design solution: Preserving agency and smoothing the flow

Fragmentation it is

After revealing the fragmentation problem and receiving the directive that both applications had to be visible, I conducted a design sprint to turn those constraints and findings into the best solution possible, given the situation.

1

Understand

  • Go over the sprint format and the agenda for the day
  • Give a refresher talk to the team about the research findings and business constraints
  • Get input from the resident subject matter experts
  • Discuss user needs and map them out
  • Conduct a 'How might we...' activity
  • Set goals for the sprint
"How might we" notes. Exploring diverse ways to solve for user needs.
2

Ideate

  • Research reminder
  • Business needs reminder
  • Review the 'How might we...' notes
  • Generate ideas
  • Concept card sketching
  • Feature prioritisation based on user needs
The design sprint team exploring solution sketches. Collaborative ideation involving subject matter experts helped secure stakeholder alignment.
3

Decide

  • Refine ideas
  • Traffic light activity to help refine
  • Sketch out flows and key user interface elements

How we dealt with the fragmentation

As we had no control over what happened inside the AI software, the idea we chose to prototype added support within IndustryCorp's software: letting users know when they were about to enter the AI software, then updating them on what had happened there and what their next steps were.
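
One way to model those support moments is shown in the minimal sketch below; every name and piece of copy here is a hypothetical illustration, not the actual prototype content.

```python
from dataclasses import dataclass
from enum import Enum, auto

class HandoffStage(Enum):
    """Moments where the prototype added support inside IndustryCorp's software."""
    BEFORE_AI = auto()  # user is about to switch into the AI software
    AFTER_AI = auto()   # user has returned from the AI software

@dataclass
class SupportMessage:
    headline: str
    detail: str

# Hypothetical copy; the real prototype's wording differed.
SUPPORT = {
    HandoffStage.BEFORE_AI: SupportMessage(
        "You're about to open the AI workspace",
        "Your case details carry over, and you can return here at any time.",
    ),
    HandoffStage.AFTER_AI: SupportMessage(
        "AI analysis complete",
        "Here is a summary of what happened in the AI workspace and your "
        "suggested next steps. The final decision remains yours.",
    ),
}

def support_for(stage: HandoffStage) -> SupportMessage:
    """Look up the supportive message shown at a given handoff moment."""
    return SUPPORT[stage]
```

Note how the post-return copy deliberately reinforces the agency principle surfaced in research: the professional, not the AI, makes the final call.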

4

Prototype

  • Create wireframes
  • Share with the team
  • Create prototype
  • Refine as needed
5

Validate

  • Usability testing of the prototype with users, to gain granular insight into remaining friction points and to see whether the added support was enough to smooth over the user-flow problem
  • Analyse test data
  • Present findings and capture actions

Results

Design sprint usability testing results

The extra support we added in IndustryCorp's software was clearly working and provided a significantly better experience, but we received the same types of comments about software switching and the AI software's user interface, validating our hypothesis that user-experience fragmentation was a critical barrier to adoption.

User feedback on fragmented user flow

"Visually if you can make those screens like IndustryCorp's that would be better. One piece of software is seamless, why leave, doesn't make sense."
"It'd be a faster flow in 1 software."
“Would love for it to all be in one software... I would hate for us to throw too many systems at staff. That’s getting old... let’s simplify!”

User feedback on interface problems in TechStartup software

"Do this (give some instruction), like in IndustryCorp software, to give some guidance"
"I am not sure what those are... am i suppose to check those?"
"I’m like, now what do I?"
"Next or more than sign is weird."
"Outdated and super small font. It doesn't match the look and feel of the one i just came from."

AI resistance

"An experienced person might be irritated because they like to be in control. No amount of convincing will change their mind."
"It was more the AI piece that was the new and potentially scary piece."
"It’s going to tell me that I don’t know what I’m doing!"
One participant quoted a previous conversation with a colleague: "So what are you going to do in your next career when this takes over? Because I’m sure I’m going to be out of a job soon and so are you."

AI positive

Agency: "I still get to choose. It’s my decision."
Knowledge and result improvement: "It is a very powerful tool… a better decision-making tool."
"I thought it was cool. I think we need to be part of something that’s going to change things.""

Financial positive

"How does it change the volume (of clients) that I might be able to have?"
"We cannot keep going the way that it is and be able to be a strong, viable option, so something has to change."

IndustryCorp response: Strategic decisions driven by research

IndustryCorp could see the huge gap between the experience presented by our prototype and the way their partner's software presented itself. They requested a list of recommendations they could share with TechStartup, so that TechStartup could improve their software as well. Understanding TechStartup's sensitivity about the visibility of their product and brand, I made modest recommendations focused on usability.

TechStartup believed users would learn to work with the software and that the changes were unnecessary. IndustryCorp pointed out that after 3 months users were still struggling, there was still resistance to using the AI, and that a successful launch required the solution to meet a higher standard.

The startup was unwilling to:

  • be bought out
  • agree to a solution where the AI runs in the background
  • collaborate towards a more professional, usable, launch-ready solution.

The research findings, and their partner's unwillingness to collaborate in response to the evidence, influenced major business decisions.

Partnership restructuring

Based on the repeatable results shown in research and the startup's unwillingness to make improvements on their end, IndustryCorp decided to end the relationship, claimed a loss on licensing costs and began developing a different, integrated AI solution internally, rather than continue with the fragmented partnership.

Reflection

What made this work

Delivering uncomfortable truths

Research findings that challenge expectations can initially meet resistance, but data-driven insights ultimately drive better strategic decisions.

Cross-functional design sprint

Collaboration with subject matter experts secured buy-in from key stakeholders, and testing during the sprint validated the integrated solution concept.

Rigorous longitudinal methodology

Research over time with consistent measurement points provided credible evidence for strategic decisions.

International team coordination

Training US colleagues in research facilitation ensured consistent data collection across locations.

Lessons learned

User experience spans system boundaries

When fundamental architecture problems exist, surface-level improvements won't drive adoption. Individual software usability means nothing if the workflow across platforms is fragmented.

Strategic UX thinking drives business decisions

Understanding how user experience problems connect to partnership and development strategies expanded the impact of research insights far beyond interface improvements.

Continued delivery of honest results

The declining SUS scores were initially disappointing but became crucial evidence for strategic decisions.

Integration beats optimisation

Sometimes the solution isn't improving individual components but rethinking the entire system architecture.
