What AI tutors mean for AI for schools and why teachers still matter
18 November 2025

Dan Hart

CEO, Co-Founder, CurricuLLM

AI tutors are getting good. Like, properly good in some contexts.

But the more I read, the more I'm convinced the story isn't "AI replaces teaching". It's "AI changes what teaching is for". And it puts more pressure on us to design learning on purpose, not just drop a chatbot into the mix and hope for the best.

This week I read a few pieces that connect nicely. They all point to the same tension: AI can lift learning, but it can also flatten it if we let it do the thinking.

AI tutors can lift learning, but only when they're designed to make students think

A piece by Dr Ari Pinar in RenewED looks at AI tutors and what they might mean for the future of teaching. It pulls together tutoring research, intelligent tutoring systems, and the newer wave of large language model tutors.

It references studies (including a Harvard GPT-4 physics trial) suggesting AI tutors can match or exceed traditional classroom instruction on measured learning gains in some cases, especially for early exposure to new content.

But the important part is the warning.

Unguided use of general AI tools can encourage shortcuts and cognitive offloading. The system does the mental work and students feel productive, but their actual understanding gets thinner.

Well-designed tutors behave differently. They use scaffolding, feedback, and productive friction so the student is still doing the thinking.

Key ideas that landed for me:

  • AI tutors can deliver significant learning gains when grounded in evidence-based teaching practices
  • Poorly designed AI use can harm learning by replacing thinking, not supporting it
  • The most promising systems lean on cognitive science and mastery learning
  • AI is strongest as a supplement to human teaching, not a substitute
  • Teachers stay essential for motivation, mentoring, and the social side of learning
  • The real opportunity is letting AI handle routine instruction while teachers focus on higher-order thinking, connection, and whole-child development

That's a very practical frame for AI for schools. The question isn't "do we allow AI?" It's "do we design AI use so the student still has to build the mental model?"
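To make that design difference concrete, here's a minimal sketch in Python. Everything in it is illustrative rather than any particular product's implementation: the two prompts are my own wording, the model name is a placeholder, and the helper just wraps the OpenAI chat API as a stand-in for whatever model a tutor product actually uses.

    # A minimal sketch of "answer machine" vs "scaffolded tutor" as a
    # prompt-level design choice. Illustrative only: the prompts are my
    # own wording and the model name is a placeholder.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    SHORTCUT_PROMPT = "Answer the student's question directly and completely."

    SCAFFOLD_PROMPT = (
        "You are a tutor. Do not give the final answer. "
        "Ask what the student has already tried, offer one hint at a time, "
        "and have them attempt the next step before you continue."
    )

    def tutor_turn(system_prompt: str, student_message: str) -> str:
        """Run one student turn under a given tutoring policy."""
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder; any chat model would do
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": student_message},
            ],
        )
        return response.choices[0].message.content

    question = "What's the derivative of x**2 * sin(x)?"
    print(tutor_turn(SHORTCUT_PROMPT, question))  # hands over the answer
    print(tutor_turn(SCAFFOLD_PROMPT, question))  # keeps the student thinking

Same model, same question; the only difference is whether the design leaves the mental work with the student.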

A bigger shift is happening underneath: what is school *for* when AI can generate everything?

A paper from University of Cambridge researchers in the British Journal of Educational Technology pushes the conversation up a level.

Their argument is basically: generative AI doesn't just change tasks. It challenges the purpose of schooling when systems are tightly linked to print literacy, individual essays, and exam performance.

They frame GenAI as both a risk and a remedy, depending on how we design pedagogy and assessment.

What stood out:

  • GenAI challenges education models built around individual production of print-based work
  • AI can be a dialogic partner that expands dialogue, rather than replacing thinking
  • They propose a "double dialogic pedagogy" with two strands:
    - thinking through dialogue in small groups
    - inducting learners into long-term cultural and disciplinary conversations
  • AI could support collective intelligence by grouping learners, coaching discussion, and staging structured collaboration

I like this because it's not a "ban it / embrace it" take. It's saying: build a learning ecosystem where students learn to think with others, with tools, and across time.

Paper link: Read the full paper in the British Journal of Educational Technology

Knowledge production is being outsourced, and education needs to hold the line on values

An article in The Conversation makes a point that feels obvious once you see it.

GenAI tools are becoming co-creators in learning. That creates tension between:

  • efficiency for students
  • depth of learning for educators
  • engagement incentives for tech providers

The risk is that knowledge production gets shaped by a small number of private companies rather than educational values.

This is one of the reasons I built CurricuLLM. Curriculum gives clarity on what students should learn, supports progression, and keeps consistency and fairness across schools. It's a way to anchor AI for schools in something stable, rather than letting the model's defaults become the "curriculum".

Read more: How generative AI is changing education

The community lens matters: care, co-design, and real local problems

I also joined a panel hosted by Sally Cripps with Jacky Hodges, Jill Bennett, and Stephen Hardy at Government House for the Royal Society of NSW.

It was a good reminder that AI isn't only an education story. It's a community story.

We talked about:

  • delivering services in rural and remote communities
  • co-design for older people and people living with dementia
  • how work changes when AI becomes part of everyday tasks
  • what it takes to introduce AI safely and responsibly in schools
  • early examples of how students and teachers are using these tools in practice

Full conversation: Watch on YouTube

Safety isn't just policy; it's also understanding what the model is doing

OpenAI published work on making neural networks easier to understand using sparse models.

The idea is that if most connections are set to zero, the model can form smaller circuits that are easier to trace and explain. They show examples where specific circuits are necessary and sufficient for certain behaviours.
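A toy way to see the intuition, as a sketch in numpy rather than anything from OpenAI's actual setup: compare a dense weight matrix with one where most entries are masked to zero, and look at how short the list of connections behind any single output becomes.

    import numpy as np

    # Toy illustration of why sparsity aids interpretability (my own
    # sketch, not OpenAI's training method). In the dense layer every
    # output reads every input; in the masked layer each output reads
    # only a few, so the "circuit" behind it is small enough to trace.
    rng = np.random.default_rng(0)

    dense_w = rng.normal(size=(8, 8))

    mask = rng.random((8, 8)) < 0.1          # keep roughly 10% of weights
    sparse_w = np.where(mask, dense_w, 0.0)

    def influences(weights, output_unit):
        """Input units with a non-zero connection to one output unit."""
        return np.nonzero(weights[output_unit])[0].tolist()

    print("dense:  unit 3 reads inputs", influences(dense_w, 3))   # all 8
    print("sparse: unit 3 reads inputs", influences(sparse_w, 3))  # a handful

Real models stack many layers, so the circuits are longer, but the tracing idea is the same: fewer non-zero weights means fewer paths to follow when you ask why the model behaved a certain way.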

Why this matters for education is indirect but important.

If we want to deploy AI tools at scale, including in classrooms, we need more than "it seems to work". We need better ways to analyse, debug, and evaluate behaviour. Interpretability work like this is part of building systems that are more trustworthy.

Read more: Understanding neural networks through sparse circuits

Most organisations are still using AI shallowly, and that will be true for schools too unless we build capability

An AFR piece covering Reserve Bank research suggests adoption in Australian businesses is still early and often shallow.

Key points:

  • nearly 40% of firms report minimal AI use
  • most adoption is limited to simple tasks like email summaries and research
  • tech spending is up, mostly driven by cybersecurity
  • productivity gains linked to AI haven't shown up yet

My take is the same as always. Value doesn't arrive with a licence. You have to adapt processes and build skill to unlock it. That's true in business, and it's true for AI for schools as well.

Read more: RBA survey reveals shallow AI adoption

A quiet reminder that the model race keeps moving

GPT-5.1 dropped without much fanfare.

Which is a small thing, but it's also the point. These tools will keep improving in the background. Schools and systems can't treat this as a one-off "AI moment". It's an ongoing shift.

Read more: Introducing GPT-5.1

Where I'm at after all this

AI tutors can lift learning. That part is real.

But outcomes depend on design.

If we use AI to remove the mental work, learning thins out. If we use AI to scaffold, prompt, and keep students engaged in the hard parts, learning can improve.

The best future I can see is simple:

  • AI handles routine instruction and repetition
  • teachers spend more time on motivation, mentoring, relationships, and higher-order thinking
  • assessment rewards process, dialogue, and judgement, not just polished output
  • schools hold the line on values so learning isn't quietly outsourced to whatever the big platforms optimise for

That's the version of AI for schools I'm excited about.
