One of the themes that keeps coming up for me is how easy it is to confuse access with adoption.
We roll out a tool. We run a pilot. We write a policy. Then we're surprised when usage is uneven, impact is mixed, and momentum fades.
Most AI projects don't fail because the model isn't smart enough. They fail because the goal was framed too small, or because nobody owned the hard middle bit of turning "strategy" into "this helps me on Tuesday".
This week's links all orbit that same point, across business, education, and everyday life. The next phase isn't about better tech. It's about better leadership.
A quick side note for UK data science folks
Another great role for anyone interested, working with my buddy Greg Hodgson.
I recently caught up with Greg and the team, and I was impressed by how methodical and well planned their approach to data science is. If any of my data science colleagues in the UK are looking for work, I'd recommend applying.
(Keeping it short here because the bigger theme of this post is adoption, not recruitment.)
Most AI projects fail because the goal is wrong
John Winsor and Sangeet Paul Choudary wrote a piece in Harvard Business Review making a point that matches what I keep seeing.
Most AI projects fail not because of poor technology, but because they are framed with the wrong goals. Companies focus on efficiency instead of transformation.
They argue:
- AI should drive reinvention, not automation
- the biggest gains come from rethinking architecture, not upgrading tools
- the challenge is strategic alignment, not technical execution
And the best line: the real question isn't how to join the 5% of successful projects. It's how to define success in the first place.
That's a useful lens for schools too. If AI for schools is framed as "save time on worksheets", it'll stay small. If it's framed as "reshape learning support and equity", it becomes strategic.
Read more: What executives get wrong about AI
Managers are the missing link between AI strategy and real adoption
EdTech developers, this one's for you.
As I've been thinking about integrating chat-based AI into teams, I keep coming back to the same idea. The real opportunity sits with managers.
Most rollouts stall because there's a gap between access and adoption. We make AI available, but we don't make it obvious how it helps in the specific workflow of a specific team.
Managers are the ones who can close that gap.
They know where time gets lost, where processes feel heavy, and where the bottlenecks are. They can connect "AI" to "this task" in a way that makes sense.
Gallup found employees are over six times more likely to find AI useful when their manager actively supports its use, yet only 28% say that support is happening.
Managers don't need to become AI experts. But they do need:
- time to explore
- guidance relevant to their function
- training that gives confidence to lead change in the workflow
If we want to realise value, we need to equip the people closest to the work. That's how AI stops being a shiny tool and starts becoming a capability.
Read more: Manager support drives employee adoption
AI in education will widen gaps if support stays uneven
A new study from the University of Washington (covered in The Conversation) highlights something schools already know in their bones.
AI in education is developing faster than some schools can keep up with.
Some teachers are seeing real benefits: saving time on activity planning, marking, and resource creation. Others feel unprepared, lacking the training and support to use AI effectively.
When professional learning, time, and access differ across schools, new technologies can deepen inequality. Well-resourced schools explore and adapt; others get left behind.
Teachers in the study also emphasised the obvious but important point: relationships still matter most. AI can provide information, but it cannot replace the human connection that drives learning.
This is why I keep saying AI for schools isn't only a technology decision. It's a capability and leadership decision. If school leaders don't actively build shared practice, the default will be uneven adoption and uneven outcomes.
Read more: AI could worsen inequalities in schools - teachers are key
AI's impact isn't abstract anymore, it's personal
An ABC News feature shared stories from Australians about how AI is reshaping daily life. Law, education, design, health, translation. People described opportunities and efficiency, but also displacement, uncertainty, and that feeling that "the future is happening without us".
Themes that stood out:
- jobs and skills evolving faster than people can adapt
- AI tools reshaping creative and professional roles
- uneven access to knowledge and opportunity
- a growing need for ethical design, regulation, and education
What I took from it is simple. Building AI responsibly isn't just a technical task. It's a social one. It changes identity, confidence, purpose, and the feeling of control people have over their future.
That's why change management isn't a side quest. It's the main quest.
Read more: Artificial intelligence rise - Australian readers' concerns
Reconnecting the dots from NSWEduChat to what comes next
I also really enjoyed reconnecting with my colleague Jacky Hodges to talk about how NSWEduChat came into existence, and AI in education and the workplace more broadly.
It was a nice reminder that "successful AI" usually doesn't start with a model. It starts with a real problem, a group of people willing to iterate, and leadership that can hold the line through the messy middle.
Where I'm landing this week
Across business and education, I'm seeing the same pattern repeat.
- Tools are getting better fast
- Adoption is still uneven
- The limiting factor is not capability, it's leadership and support
If you want AI to scale, you need someone to translate ambition into practice.
In organisations, that person is often the manager.
In schools, it's leaders and teacher champions who create the shared language, the safe norms, and the professional learning that turns "AI is here" into "this is how we use it well".
AI for schools won't be won by whoever has the fanciest model.
It'll be won by the systems that help teachers feel confident, supported, and connected, so the technology strengthens learning instead of widening gaps.