The AI readiness gap: What it is and how leaders can close it
Technology is moving faster than organisational culture, creating a widening "AI readiness gap." While AI tools are more intuitive than ever, many professionals are still hesitant to own their usage. Discover how leaders can move beyond shallow productivity gains to build a culture where AI truly augments human judgment and creativity.

This moment in technology is not a standard update. What we’re living through is a fundamental shift in how work gets done, how we make decisions and how we create value.
As Adam Spencer put it during a keynote speech, quoting Justin Trudeau:
“The pace of change has never been this fast before — and it will never be this slow again.”
The message lands even harder today than when it was first delivered: change is now compounding. For leaders, the challenge is a double-edged sword, demanding innovation and experimentation on one hand and strong governance and risk management on the other.
Technology capability is moving fast, with huge leaps approximately every two years. Operating models and leadership priorities are not. The result is a widening gap between what AI has the potential to do and what it is actually being allowed — or trusted — to do in practice.
This gap is the AI readiness gap. And closing it is now a leadership responsibility.
Dismantling the 90/10 rule of technology adoption
Historically, around 90% of users extract only about 10% of a technology’s available value before hitting what some call the “wall of specialty knowledge.” You get far enough to be productive, but not far enough to transform daily workflows and operations.
AI disrupts that pattern, yet creates a paradox in the process. On one side, AI tools are highly intuitive. User-friendly interfaces let people perform complex tasks without any training (for example, you don’t need to know how a language model works to ask it a question and get an answer). In many cases, the barriers to entry are almost completely gone.
On the other side, many are reluctant to own their AI usage. Despite the simple, accessible interfaces of these tools, 70% of professionals hesitate to disclose whether, or how often, they use AI at work. The reasons include fear of judgement, fear of getting it wrong, aversion to policy breaches and not knowing where the limits are.
This is the adoption paradox. AI feels easy to use, but culturally, it’s hard to talk about.
For leaders, it creates a problem. If AI remains something people use experimentally or defensively, organisations will never achieve more than shallow productivity gains. The problem isn’t about the tech we’re adopting; it’s about the mindset.
Closing the gap requires an intentional shift from treating AI as a specialist capability reserved for IT or innovation teams to treating it as a strategic tool for every role. AI has established its place in every function, from finance, legal and HR to operations and sales. It’s leadership’s job to normalise that reality.
How to redefine work by augmenting the human element
One of the most prominent conversations around AI is whether it will replace people. As Adam said, “AI takes away some of the stuff you did at work that was never really your job to begin with.” In other words, AI absorbs ancillary tasks rather than replacing actual human roles.
Distinguishing “replacement” from “augmentation” matters because when you remove repetitive or preparatory work, what’s left is human work that AI cannot replicate: judgement, creativity, building relationships and making decisions where the outcomes are unpredictable.
Augmentation is already delivering tangible results. Adam shared an example from Dell’s sales teams, which used AI to generate meeting briefings. They cut meeting prep time in half compared with pre-AI workflows, reclaiming 20% of their overall working hours. That time could then be redirected into more meaningful conversations, stronger client relationships and strategic thinking.
The same principle applies across knowledge work. Tools like NotebookLM can synthesise huge amounts of data and information into short audio summaries within minutes. What once took hours of reading can now be understood much faster.
With manual work automated, leaders have two options. They can lean into higher outputs or use that space to enable higher-level thinking that was previously held back by low-value workflows. The organisations that close the AI readiness gap consistently choose the latter.
Navigating data governance to streamline and operationalise AI
To illustrate the role of data governance, Adam uses a simple analogy: a high-performing organisation is like a powerful sports car, and data is the fuel that keeps it running. Clean, organised data is like 98-octane fuel, whereas siloed, inconsistent data is like pouring sand into the fuel tank and hoping for the best.
Without sound data foundations, no amount of integration can unlock AI’s full potential. This is where good governance enables performance, ensuring that innovation, when it happens, actually works.
Consider the Shadow AI crisis. Adam noted that a large number of employees use AI tools without their employer’s knowledge or approval. In isolated circumstances, informal AI experimentation may seem harmless. At scale, it poses serious risk — especially when individuals upload sensitive information to open models without visibility or control.
At the same time, the threat landscape is evolving. Agentic AI systems can make decisions autonomously, while deepfakes and avatars blur the line between real and fabricated communications, with the potential for catastrophic organisational harm. Code can be written faster than humans can realistically review or explain it. Leaders must be prepared for that reality.
Future-proofing organisational continuity
Addressing the AI readiness gap isn’t just about the technology anymore. Leaders need to start planning for the workforce and succession challenges that come with the tech. Below are three insights Adam offered to help leaders future-proof their businesses in the age of AI.
1. Preserve proprietary knowledge and skills in employees
Over-automating entry-level roles can carry irreversible consequences down the line. When junior staff outsource formative work — the “hard yards,” as Adam framed it — they deny themselves crucial opportunities to build instinct, context and judgement in their roles. As a result, organisations risk narrowing their future leadership pipeline by as much as 75%.
To protect a business’s intellectual capital, entry-level employees must continue to build the knowledge and skills required for senior decision-making later in their careers. The solution is to apply automation with discernment: junior employees should still do the thinking, while AI accelerates learning, feedback and exposure.
2. Leverage AI for market access
For Australian organisations in particular, real-time transcription and translation capabilities can break the language barrier, allowing teams to compete in international markets instantly. Businesses can transcend national borders without waiting years to build linguistic capability. And that advantage is available now.
3. Make AI a cultural mandate
To ensure AI sustainably augments, rather than replaces, human expertise, leadership should make it a cultural mandate. That means continuous learning about capabilities, open conversations about use and transparency around where AI fits and where it doesn’t. Human skill and expertise must remain embedded in the work that people do.
Leading toward the 2030 horizon
With Artificial General Intelligence potentially emerging by 2030, the window for cultural adaptation is narrowing and the direction is clear: leaders who delay action while waiting for certainty will find themselves managing risk rather than shaping opportunities.
Adam stressed that closing the AI readiness gap requires a shift in focus. It’s no longer just about the tools; it’s about evolving how work happens. Data should be connected and operationalised, not just protected. And alongside AI’s efficiency gains comes the need to preserve human judgement and critical thinking.
The executives who succeed will be those who ensure their organisations remain deeply human while wielding one of the most powerful tools in human history.
Now is the time to operationalise your data while mitigating risk. Contact us today to learn how Iron Mountain InSight DXP and our information governance services can align innovation with compliance.