How to Spot Them

Signature behaviours:

Always asks for a demo – Won’t touch it until they’ve seen it work.
Over-analyses accuracy – Breaks down results, errors, and edge cases.
Slow to adopt – Waits for proof before even small trials.
Voices AI’s limits – Brings up flaws and risks in every discussion.
Cautious with input – Shares data selectively due to privacy concerns.

What this means for you:

  • They protect the team from poor-quality tools and decisions.
  • They flag issues others ignore, from security to accuracy.
  • But they can stall progress and lower team confidence if their scepticism spreads.
  • With the right approach, they become your most thoughtful advocate for trusted AI.

The Challenges They Create

⚠️ High bar for adoption – Needs full proof before trying anything new.
⚠️ Slow to trust – Hard to convince without detailed transparency.
⚠️ Privacy-first mindset – May avoid using AI entirely if data feels unsafe.
⚠️ Fixates on flaws – One error can turn into a full rejection.

What to Do

Don’t sell, show

  • Share verified case studies tied to real outcomes.
  • Use live demos to walk through results and edge cases.
  • Invite them to test AI in safe, controlled environments with real data.

Break down how AI works

  • Explain step-by-step how decisions are made.
  • Make limitations clear rather than hiding them.
  • Share how errors are caught, corrected, and improved over time.

Let them test before they trust

  • Pilot AI in low-risk tasks they can evaluate.
  • Offer review checkpoints so they stay in control.
  • Encourage side-by-side comparisons of AI and human output.

What Success Looks Like

✔️ They endorse AI with confidence, not just compliance.
✔️ They raise the bar for reliable, accurate AI use.
✔️ They help teams use AI critically, not blindly.
✔️ They shift from blocker to believer, without losing their standards.
