AI for schools is having a weird moment. On one hand, we're seeing real learning gains in places where traditional resources are scarce. On the other, we're watching AI slip into classrooms faster than our ability to teach students how to interpret it safely. And behind the scenes, the hardware powering all of this is evolving so quickly that what feels "cutting edge" today becomes normal much sooner than anyone expects.
This week's reads landed on three connected themes for me: equity, literacy, and acceleration. Equity, because the biggest upside of AI in education may be in regions where there simply aren't enough teachers, books, or support services. Literacy, because students can't just be passive users of black-box systems. And acceleration, because new compute platforms are quietly changing the economics of AI, which will affect what tools schools can access next.
The learning crisis is real, and AI-enabled tools are showing gains
The World Bank recently shared data on the learning crisis in Sub-Saharan Africa, where 86% of children struggle to read a basic text by age 10. That number is hard to sit with. It's also a reminder that AI for schools can't only be a conversation about convenience features in high-income settings. In many contexts, the baseline problem is access to quality learning support, full stop.
The encouraging part is that in places like Nigeria and Kenya—where traditional resources are limited—AI-enabled tools are starting to show learning gains. The World Bank piece is essentially a synthesis of studies pointing to improvements when AI-supported edtech is designed for local needs and realistic constraints. That matters, because "AI in education" isn't a single thing. Outcomes depend on implementation, language support, teacher involvement, and whether the tool fits the reality of the school system it's entering.
Good summary and link here: https://blogs.worldbank.org/en/education/the-future-is-africa--shaping-ai-enabled-edtech-for-skilling-the
Why faster chips matter for AI in education
Permit me to geek out for a moment.
Nvidia's Rubin platform is a big deal for AI training, especially for transformer models. This isn't just "a faster GPU." It's a rethink of how compute, memory, and networking work together so models spend less time waiting and more time training. Compared to Blackwell, Rubin is designed to keep models fed with data, and that matters a lot for modern workloads like long context, mixture of experts, and heavy reasoning.
The raw performance uplift is eye-catching, but the part I find most important is bandwidth and efficiency. Faster memory, faster interconnects, and better scaling change the economics of training. If you can train the same massive models with fewer GPUs, less power, and less time, you don't just make hyperscalers happy. You reshape what's viable for everyone else building models and applications on top of them.
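To make "change the economics" concrete, here's a back-of-envelope cost model. Every number in it is invented for illustration (the GPU count, run length, hourly price, and the 2x speedup); none of them are Nvidia's figures. The point is just the shape of the arithmetic: double the effective throughput and you halve the GPU-hours for the same job.

```python
# Back-of-envelope training cost model. Every number here is invented
# for illustration; none are Nvidia's or anyone's real figures.

def training_cost(gpu_count: int, hours: float, price_per_gpu_hour: float) -> float:
    """Total cost of a training run in dollars: GPUs x hours x hourly price."""
    return gpu_count * hours * price_per_gpu_hour

# Hypothetical baseline run: 1,000 GPUs for 30 days at $2 per GPU-hour.
baseline = training_cost(1_000, 30 * 24, 2.00)

# Suppose a new platform delivers 2x effective throughput per GPU, because
# faster memory and interconnects mean less time stalled waiting on data.
# The same training job then needs half the GPU-hours.
speedup = 2.0
next_gen = training_cost(1_000, 30 * 24 / speedup, 2.00)

print(f"Baseline run:   ${baseline:,.0f}")   # $1,440,000
print(f"2x throughput:  ${next_gen:,.0f}")   # $720,000
```

Real comparisons are messier, since per-GPU prices and power draw shift between generations too, but the direction is the one that matters for what eventually trickles down to education tools.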
Why does this matter for AI for schools? Because capability tends to trickle down. Hardware breakthroughs make model training cheaper, which makes frontier-grade models more available, which makes more powerful education tools possible at lower cost. Not instantly, and not evenly, but over time this is how "exotic" becomes normal.
Nvidia link here: https://developer.nvidia.com/blog/inside-the-nvidia-rubin-platform-six-new-chips-one-ai-supercomputer/
Media literacy is becoming AI literacy, whether we like it or not
I mentioned the need for better AI education recently, and an example from Finland shows what that looks like in practice. A 10-year-old student, Ilo Lindgren, practising how to spot fake news, admitted that telling the difference between fact and fiction is getting harder. Honestly, that's one of the most accurate summaries of the moment we're in. The lines are blurring for everyone, not just kids.
What Finland seems to be doing well is treating this as a core skill, not a one-off lesson. It's not only "don't believe everything you see." It's teaching students how influence works, how media works, and now how AI changes the texture of information itself.
Euronews link here: https://www.euronews.com/next/2026/01/05/after-decades-of-teaching-media-literacy-finland-equips-students-with-skills-to-spot-ai-de
AI can improve performance while quietly harming self-awareness
One of the most interesting studies I saw this week tested people on logic problems. People using AI did perform better—but their self-awareness went out the window.
Participants using AI improved their scores by about 3 points. But they overestimated their performance by 4 points. They thought they were doing way better than they actually were. Even more counterintuitive: AI literacy didn't fix it. People who were more familiar with AI were even more likely to be overconfident.
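If "calibration" sounds abstract, here's a minimal sketch of what's being measured. The scores below are made up; only the pattern (real gains, even bigger self-estimates) mirrors what the study reports.

```python
# Toy sketch of a calibration gap. The scores are invented; only the
# pattern (people score higher with AI, but estimate higher still)
# mirrors what the study reports.

participants = [
    # (actual score with AI, self-estimated score)
    (13, 17),
    (12, 15),
    (14, 19),
    (11, 16),
]

gaps = [estimate - actual for actual, estimate in participants]
mean_gap = sum(gaps) / len(gaps)

# A positive gap means people believe they did better than they did.
print(f"Mean overconfidence: {mean_gap:+.1f} points")  # +4.2 points
```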
This is a big deal for AI for schools, because it points to a risk that isn't talked about enough. The danger isn't only that AI might be wrong. It's that AI can make us feel right. Confidence inflation is a learning problem. If students can't calibrate their own understanding, they can't improve it.
I use AI every day, and this is a great reminder to keep the internal BS detector switched on. Just because an answer sounds confident doesn't mean I'm suddenly smarter today.
Study link here: https://www.sciencedirect.com/science/article/pii/S0747563225002262
The skills gap: kids are using AI more, but fewer are learning how it works
The Guardian ran a piece on the need for better AI education in schools. The concern is simple: kids are using AI more than ever, but fewer are actually studying how it works. Experts worry this creates a divide where most people become passive users of a black box, unable to challenge automated decisions that may affect their health, finances, and opportunities.
That rings true to me. When people don't understand how AI generates outputs (probabilistic text based on patterns, not truth), they can end up trusting it in ways that are badly misplaced. I don't think everyone needs to master matrix maths (even though that would be a fascinating world), but the soft skills of interpreting outputs are becoming essential.
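If "probabilistic text based on patterns, not truth" sounds hand-wavy, here's a toy version of the mechanism. The words and probabilities are invented, and real models sample from vastly larger learned distributions, but the core move is the same: the next word is drawn by probability, not checked against facts.

```python
import random

# Toy next-word picker. Real language models sample from enormous learned
# distributions; this hand-written table is hypothetical, but the
# mechanism is the same: output is chosen by probability, not verified.

next_word_probs = {
    # invented continuations of "The capital of Australia is ..."
    "Canberra": 0.55,
    "Sydney": 0.35,     # wrong, but plausible-sounding, so it gets real weight
    "Melbourne": 0.10,
}

words = list(next_word_probs)
weights = list(next_word_probs.values())

# Sample a continuation. Run it a few times: the confident-sounding
# wrong answer comes up roughly a third of the time.
print(random.choices(words, weights=weights, k=1)[0])
```

That's the whole point of interpreting outputs as a skill: the sampling step has no truth-check in it, so the reader has to supply one.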
The article argues AI literacy is as important as reading and writing. That might be a stretch, but the direction is right. If we're heading toward a world shaped by algorithms, everyone needs enough foundation to notice when a result doesn't look right, and to know what to do next.
Guardian link here: https://www.theguardian.com/education/2026/jan/05/generation-ai-fears-of-social-divide-unless-all-children-learn-computing-skills
A glimpse of the "high-resource" future: camps, avatars, and private school experimentation
The Hollywood Reporter offered a surprisingly useful window into how LA's elite private schools are navigating AI. It's a very specific segment of education, but that's partly why it's interesting. In high-resource settings, "AI summer camps" and personalised chatbot avatars are already becoming normal. This is what early adoption looks like when budgets and parent expectations are different.
I don't think this is the model most schools should copy. But it's a signal of where the market is heading, and what some families will start to expect elsewhere over time. It also highlights a tension that keeps coming up: when innovation moves fastest where resources are highest, inequity can widen unless we deliberately design for broad access.
Hollywood Reporter link here: https://www.hollywoodreporter.com/lifestyle/lifestyle-news/ai-hollywood-top-private-schools-1236462767/
The thread I'm pulling on from all of this
Putting all of this together, I keep coming back to a simple framing.
AI for schools has real potential to lift learning outcomes, especially where the learning crisis is most severe and traditional supports are limited. But at the same time, AI increases the need for strong interpretation skills: media literacy, critical thinking, calibration, and healthy scepticism.
And the pace is only going to accelerate. Hardware platforms like Rubin will reshape what's possible, which means schools will be offered more powerful tools sooner than their systems can comfortably absorb. That doesn't mean schools should slam the brakes. It means we need to pair adoption with education.
Not just "how to use the tool," but how to think with it, how to challenge it, and how to stay human while doing it.