What's Actually Changing in AI for Schools in 2026 and What Matters Most
17 January 2026

Dan Hart

CEO, Co-Founder, CurricuLLM

If you work in education right now, you can feel the shift. Not in a single "big bang" moment, but in a steady change to what students and teachers expect from technology, what parents are dealing with at home, and what schools are being asked to provide. "AI for schools" used to mean experimentation. Now it's drifting into the category of normal.

This week I read a few pieces and reports that, together, painted a pretty clear picture. AI is already in kids' lives. Teachers are adopting it faster than most other professions. Parents are using it too, often alongside their kids. And the conversation is starting to mature from "how much time will this save?" to "what kind of learning and culture do we want to build?"

AI is already baked into kids' lives, whether schools choose it or not

One of the most grounded takes I've seen lately came from Brookings, in a piece about raising resilient learners in an AI world. The main point is simple but important: if kids have devices and an internet connection, AI is already part of their day. A lot of that happens outside school, in the messy reality of homework, social apps, and curiosity scrolling. That means parents end up as the front-line guides, even when they're stretched thin and not sure what "good" looks like.

What I liked most was the focus on skills that can fade if AI does too much of the work. Critical thinking. Persistence. The ability to sit with something hard without immediately grabbing the fastest answer. And the social side too, making sure real human interaction doesn't get swapped out for a chatbot that always agrees. Schools can't fully control what happens at home, but they can absolutely give families shared language, shared expectations, and practical boundaries that are calm rather than reactive.

If you want the article and the resources Brookings links to, it's here: https://www.brookings.edu/articles/tips-for-parents-raising-resilient-learners-in-an-ai-world/

The big adoption story: learning is becoming the number one reason people use AI

Google's 2026 report on AI adoption made official something many of us already suspected. For the first time, "learning" is the primary reason people use AI, with 74% of users relying on it to understand complex topics. In education specifically, the numbers are high: teachers adopting at 81%, students at 85%, and parents at 76% (often learning alongside their children). Even more interesting is the optimism: 67% of teachers believe AI will improve the quality of their teaching, and 63% expect better student outcomes. Overall, 73% believe AI has a net positive impact on learning.

Those numbers aren't a victory lap. They're a signal. If AI for schools is now mainly about learning (not just productivity), then the bar changes. Schools need tools that support thinking, not shortcuts. Professional learning needs to be practical, not theoretical. And leadership needs to treat AI like a literacy and curriculum issue, not just an IT procurement question.

Report link here: https://static.googleusercontent.com/media/publicpolicy.google/en//resources/our_life_with_ai_2026.pdf

The underrated use case: AI that supports empathy, tone, and relationships

I also read an article about using AI to build empathy in schools, and it landed for me because it matches how I use these tools in real life. Most people talk about AI saving time. That's real, but I think there's another layer that deserves more attention: AI can help us be more intentional with our words.

I use AI to practice for difficult conversations, or to check whether an email sounds supportive rather than just "professional." It acts like a mirror. It shows me where I might be missing someone else's perspective, or where my tone is sharper than I intended because I'm busy. That doesn't replace relationships. It doesn't do the human work for you. But it can help you do the human work better, especially when school life is full and fast and emotionally loaded.

Here's the article: https://www.eschoolnews.com/digital-learning/2026/01/13/ai-for-empathy-using-generative-tools/

Classroom usage is rising fast, but the "non-adopters" matter too

EdWeek shared data showing that AI use in classrooms has almost doubled since 2023, rising from about 34% of teachers to 61% in two years. That's a huge shift in a short time. But what caught my attention wasn't just the growth. It was the 21% who say they won't use it at all.

That group is easy to dismiss, but I think they're worth listening to. Sometimes "I won't use it" is really "I don't have the support, the time, the training, or the confidence." Sometimes it's a values call. Sometimes it's a reaction to poor tools that don't fit the classroom reality. If we want AI for schools to be equitable, we can't build a future where only the early adopters get the benefits and everyone else is left behind or quietly judged for opting out. The job isn't to force everyone into the same behaviour. It's to create the conditions where good use is possible, safe, and aligned with learning.

EdWeek link: https://www.edweek.org/technology/more-teachers-are-using-ai-in-their-classrooms-heres-why/2026/01

AI for schools isn't only "teaching and learning" — it's also infrastructure

One of the most practical reminders this week was seeing the NSW Department of Education schools web facelift, including an AI-powered search capability built by a team I used to work with. That's the kind of AI that rarely gets called "transformational" in headlines, but it matters. It makes services easier to access. It reduces friction for families. It helps people find the right information without needing to know the exact terminology the website expects.

That's an important part of the AI for schools conversation. Not every AI use case is a classroom assistant. Some of the best wins are in the background: search, triage, forms, knowledge bases, and the everyday "how do I find the thing" moments that eat time and create frustration.

Link here: https://www.itnews.com.au/news/nsw-department-of-education-schools-web-facelift-driving-enrolments-622701

Zooming out: AI adoption is growing, but the gap is widening

Finally, Microsoft's AI Economy Institute report looked at global AI usage across countries. One figure that stuck with me was global use reaching about 16% by the end of 2025, roughly one in six people using generative AI tools. Adoption is rising, but the gap between regions is widening: about 25% of the working population using AI tools in the Global North, compared to 14% in the Global South.

Australia is positioned as a relatively high-usage country (they put it at 11th), with use growing from 35% to 37% across 2025. They also call out the UAE and Singapore as adoption leaders, South Korea's big jump driven by policy and language-model improvements, and the interesting split where the US leads in building the technology but ranks lower in usage than you might expect.

For schools, this matters because "AI for schools" is also an equity conversation. If access, language support, and training aren't addressed, the gap won't just be between countries. It'll appear between schools, between communities, and even between students in the same classroom.

Report link: https://www.microsoft.com/en-us/corporate-responsibility/topics/ai-economy-institute/reports/global-ai-adoption-2025/

Where I'm landing this week

Put all of this together and you get a fairly clear direction. AI for schools isn't a single product, policy, or lesson plan. It's an ecosystem shift.

  • Parents need calm guidance and shared language, not guilt and panic.
  • Schools need structure: what good use looks like, how we protect thinking, and how we build social learning in an AI-rich world.
  • Teachers need practical support and tools that fit real workflows, plus permission to start small.
  • Tool builders need to make it easier to use AI well, not just easier to use AI more. That means transparency, better defaults, and designs that nudge learning rather than shortcuts.

Kids don't need a perfect rule set. They need adults who can talk about AI without freaking out, set some boundaries, and keep the focus on learning how to think. That's the heart of "AI for schools" for me right now.
