AI for schools is now about trust, relationships, and the new expectation of immediacy
30 October 2025

Dan Hart

CEO, Co-Founder, CurricuLLM

There's a thread running through a lot of the AI news lately.

It's not just "AI helps learning" or "AI automates tasks".

It's that AI is becoming part of how students relate to information, to feedback, and sometimes even to companionship. That shifts the social texture of learning in ways we haven't had to think about at scale before.

This week's links range from classroom research to robots in homes to the collapse of old edtech models. Different topics, same underlying question.

What kind of relationship are we building between young people and machines?

Students' perceptions of chatbots will shape how they learn

The Conversation shared a useful piece on a simple idea.

Learning has always been social. Students develop through interaction, observing, questioning, and engaging with others. When AI becomes part of that process, it changes what "interaction" even means.

Researchers suggest students' views of AI will shape:

  • how much they trust information from chatbots
  • how they respond to AI tutoring or feedback
  • how they see AI as part of their social world

This is why "AI literacy" can't just be definitions and algorithm diagrams. It also needs to include how students interpret these systems and relate to them.

Do they see it as a tool? A teacher? A friend? A judge? A search engine? A shortcut?

Those mental models will drive behaviour.

Read more: Students' perceptions of chatbots will influence how they learn with AI

Robots in the home raise a different kind of boundary question

A new home robot called Neo is now available to order, and for now, many household tasks will be completed via teleoperation.

Meaning a human, somewhere else, is remotely controlling the robot inside your home.

It's a fascinating trade. You get utility. But you're also inviting an unidentified operator into private spaces, even if it's through a machine.

We've seen people trade privacy for convenience before, so I don't doubt this will find a market. But it raises uncomfortable questions we haven't really resolved:

  • what counts as "presence"
  • what rights and expectations apply inside a home
  • what this means for labour across borders

If a robot in Australia is operated by someone overseas, does that count as work within Australia? Would that person require a visa? We don't apply those rules to overseas developers or remote customer service, but physical presence through a robot blurs the line.

Watch: Neo home robot introduction

Chegg's collapse shows how fast student expectations are changing

Chegg announced a major restructuring, reducing its global workforce by about 45%.

They pointed to two drivers:

  • rapid adoption of AI tools by students
  • a steep drop in Google referrals to content publishers

This story is bigger than one company.

It shows changing student behaviour and a demand for immediacy. Static answer libraries are getting replaced by conversational systems that feel faster, cheaper, and more personal.

The challenge for edtech now is finding models that keep up with speed expectations while still supporting integrity, trust, and deeper learning.

That's not a minor design problem. It's the whole game.

Read more: Chegg layoffs - poster child for AI slashing staff and shares

Personalised learning is real, but it has to be more than a slogan

Personalised learning has been promised for decades, so I know most people's eyes glaze over when they hear it.

But this is still the direction that matters for AI for schools, because if you can't meet a student where they are, you end up optimising for the middle and leaving everyone else to cope.

In CurricuLLM, personalisation comes from two kinds of memory:

  • preference: how the student likes to learn
  • progression: what they know and where they struggle

That combination is what lets conversations be unique for each student, while still staying curriculum-aligned and consistent. The goal isn't "a chatbot with vibes". It's support that actually tracks what the student knows, what they struggle with, and how they learn best, without big gaps appearing over time.
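
To make that concrete, here's a simplified sketch of the idea in Python. The field names below are illustrative only, not our actual schema: the point is that one memory captures how a student prefers to learn, the other captures where they are, and both feed every conversation.

```python
from dataclasses import dataclass, field

@dataclass
class PreferenceMemory:
    # How the student likes to learn: level, pacing, the kinds of
    # examples that land for them. (Illustrative fields, not a real schema.)
    reading_level: str = "intermediate"
    pacing: str = "step-by-step"
    example_styles: list[str] = field(default_factory=list)

@dataclass
class ProgressionMemory:
    # Where the student is: topics mastered, topics they struggle with.
    mastered: set[str] = field(default_factory=set)
    struggling: set[str] = field(default_factory=set)

def build_context(pref: PreferenceMemory, prog: ProgressionMemory, topic: str) -> str:
    """Fold both memories into the instructions sent with every turn."""
    return (
        f"Teach '{topic}' at a {pref.reading_level} level, {pref.pacing}. "
        f"Assume mastery of: {', '.join(sorted(prog.mastered)) or 'nothing yet'}. "
        f"Revisit gently: {', '.join(sorted(prog.struggling)) or 'nothing flagged'}."
    )

# Example:
# pref = PreferenceMemory(reading_level="year 7")
# prog = ProgressionMemory(mastered={"fractions"}, struggling={"ratios"})
# print(build_context(pref, prog, "percentages"))
```

The design choice that matters is keeping the two memories separate: preferences change slowly, progression changes every session, and conflating them is how gaps appear over time.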

Data is the real blocker, not the model

I listened to a strong podcast episode on AI readiness that reinforces what most implementation teams already know.

Data issues stop progress more than core tech.

Fragmentation, quality, and access are the real limits.

I also liked the warning about two traps:

  • the magpie chasing shiny pilots
  • the mountaineer freezing everything while waiting for a grand rebuild

The better path is pragmatic:

  • start with feasible, high-value use cases
  • invest in reusable foundations as you learn
  • use AI for entity cleanup and retrieval
  • build SOPs where they're missing
  • design new systems with metadata and access in mind
  • provide tooling that supports many builders (prompt tools, low-code, and full-code paths)
  • evaluate constantly with datasets and rigorous testing (see the sketch below)
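
As a rough illustration of that last point, here's a minimal evaluation harness. Everything in it is a stand-in (the dataset, the matching rule, the answer_fn you'd plug in); real evaluation needs curriculum-specific datasets and stricter scoring, but the shape is the same: a fixed test set, run on every change, with a number at the end.

```python
# A minimal evaluation harness: a fixed "golden" test set, run on every
# change, producing a single accuracy number. The dataset, matching rule,
# and answer_fn are all stand-ins for whatever your system actually does.

GOLDEN_SET = [
    {"question": "What is 7 x 8?", "expected": "56"},
    {"question": "What is the capital of Australia?", "expected": "Canberra"},
]

def evaluate(answer_fn) -> float:
    """Score answer_fn (any question -> answer callable) against the set."""
    hits = sum(
        1 for case in GOLDEN_SET
        if case["expected"].lower() in answer_fn(case["question"]).lower()
    )
    return hits / len(GOLDEN_SET)

# Example: accuracy = evaluate(lambda q: my_pipeline.run(q))
```

A harness this small still does the important job: it turns "the new version feels better" into a number you can compare across releases.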

This is as true for schools and education departments as it is for enterprises. If the data layer is weak, AI becomes a demo, not a capability.

Listen: AI readiness podcast on Spotify

Emotional attachment to chatbots is no longer a fringe issue

A YouGov survey reported that one in seven Australians could imagine falling in love with an AI chatbot, and one in five have already opened up to one emotionally. Young people aged 18-24 were the most likely to form attachments.

Educators and psychologists are raising concerns about wellbeing and healthy relationships, including the idea that "virtual friendships" can feel real while lacking emotional depth, and that reliance can activate reward systems in ways that look a bit like addiction patterns.

The best safeguard suggested is also the simplest: open, non-judgmental conversations between parents and children.

For me, the key distinction is this:

  • AI designed for learning can enhance understanding and access to knowledge
  • AI designed for friendship can reshape how young people experience relationships

As AI becomes a bigger part of communication, the case grows for regulation and design standards that support safety, transparency, and age-appropriate experiences.

Read more: Could you fall in love with an AI chatbot? Children most at risk

The labour market is already signalling what it wants, even if it can't hire it yet

Robert Half shared research showing employers are shifting expectations as routine tasks are automated.

Teams are focusing more on higher-value work:

  • technology and finance spending more time on innovation, complex analysis, and governance
  • HR focusing on process improvement, policy development, and data insights

But there's a mismatch:

  • 92% of employers seek AI and automation proficiency in new hires
  • 99% are struggling to secure the skills
  • high salary expectations and scarcity of qualified candidates are the top barriers

This matters for schools because "AI for schools" isn't only about tools in classrooms. It's also about preparing young people for a labour market that is moving faster than most curriculum cycles.

Read more: Australian tech teams race to hire AI talent as demand surges

Where I'm landing this week

The biggest change I'm seeing is the expectation of immediacy.

Students can now ask a question and get an answer instantly, in a friendly tone, with infinite patience. That's powerful. It also changes how trust forms, how feedback is received, and what "learning with others" looks like.

So for AI for schools, the job isn't just picking tools.

It's shaping relationships and norms:

  • teaching students how to calibrate trust in AI outputs
  • building AI literacy that includes social interpretation, not just technical facts
  • protecting human connection as a core part of learning
  • designing for deeper thinking, not just faster answers
  • being honest about where convenience trades against privacy or wellbeing

The tech will keep improving.

What we choose to normalise around it is the real decision.
