The Gap That AI Tools Don't Close
AI tools are powerful, but access alone doesn't drive adoption. Research shows most employees use less than 20% of AI capabilities. The real gap is understanding.
Tim Clark
Co-founder · 1 April 2026 · 5 min read
TL;DR
Most businesses give their team AI tools and expect adoption to follow. It doesn't. The gap between access and understanding is where the real value gets lost. One conversation can shift someone from using 10% of what's possible to genuinely transforming how they work.
A few months ago I was talking to a friend who’s properly into AI. Not just using it occasionally. Daily user, follows all the releases, actually reads the documentation. The real deal.
We got chatting about what he was doing with it and I noticed something. He was using it well for the things he knew about. The obvious applications, the stuff you figure out in the first couple of weeks. But there was all this other stuff, capabilities that would genuinely change how he works, that he just hadn’t come across yet. Not because the tools couldn’t do it. He just didn’t know to look.
This is someone who’s paying attention. Actively engaged. And still, blind spots.
According to a 2024 McKinsey survey, 72% of organisations have now adopted AI in at least one business function, yet most employees still use only a fraction of what these tools can do. It made me curious about what happens when a company rolls out AI to a whole team.

Everyone gets access to ChatGPT or Claude, maybe a bit of training on the basics, and then off they go. A few people dive in and never look back. They’re the ones who were probably already curious, already tinkering with things in their own time. Most people try it a couple of times, don’t get much out of it, and quietly return to whatever they were doing before. The tool sits there. Available but not really used.
Six months later someone in leadership asks why the AI investment hasn’t delivered what they hoped for. Usually the answer isn’t that the tool was wrong or the team was resistant. It’s simpler than that. Nobody really learned how to use it properly in the first place.
Permission to explore
There’s something that happens in a good training session that I didn’t expect when we first started running them. It’s not really about the content, although the content matters. It’s more like permission. A shared space where people can admit they don’t know things and ask questions that might seem basic. Someone mentions a prompt that worked well for them. Someone else builds on it, tries a variation. By the end of the session there’s actual momentum building. People swapping ideas, making plans to try things, asking each other questions they wouldn’t have thought to ask before.
I’ve watched teams come out of sessions genuinely excited. Not in a forced, corporate enthusiasm way. Actually curious about what they might be able to do. They go back to their desks and start experimenting. They message each other when something works. There’s a bit of friendly competition sometimes, people wanting to find the clever application that nobody else has thought of yet.
That’s what adoption actually looks like. Not usage statistics or licence counts. People wanting to explore.
Understanding unlocks possibility
Any tool is only as useful as your understanding of what it can do. A fancy camera doesn’t make you a photographer. A professional kitchen doesn’t make you a chef. The tool creates possibility. Understanding unlocks it.
AI is particularly opaque in this way. The possibilities aren’t obvious from the interface. There’s no manual that tells you everything it can do, partly because what it can do keeps changing and partly because the applications depend so much on your specific context. You have to be shown, or stumble across it yourself, or hear about it from someone who’s already figured it out. Otherwise you use about ten percent of what’s possible and assume that’s all there is.
I think about my friend sometimes. Genuinely engaged, using AI every day, keeping up with all the developments. Still had these blind spots, capabilities sitting right there that he wasn’t using. Once we talked through some different approaches, everything shifted. Not because the tool changed. He just understood more about what he could ask of it.
He’s doing much more interesting things with it now. Building things he wouldn’t have thought to build before. All from one afternoon of conversation.
The tools are good. Getting better all the time. But they only meet you as far as your understanding takes you. Everything beyond that stays invisible until someone helps you see it.
Wondering what your team might be missing? Our AI Clarity Session helps you identify the gaps between what your tools can do and what your team actually knows. Or explore our real-world AI use cases to see what becomes possible when people truly understand the tools they’re working with.