Reflections From FOLSEA 2025

At the FOLSEA conference this weekend, the enthusiasm for AI in education was unmistakable. Most discussions focused on the tools themselves and the clever ways teachers are starting to use them. What stood out, though, was how little attention was given to the deeper implications for learning. There is a growing gap between using AI and understanding what it replaces in the learning process.

Children build understanding through repetition, correction, and practice. When they read, write, solve problems, or build ideas step by step, they strengthen the neural pathways that support memory and reasoning. This synaptic strengthening is not a luxury; it is the biological process that makes learning stick. When AI produces explanations, drafts, or solutions on their behalf, students may skip the experiences that shape durable knowledge. It can leave learners looking confident in the moment but less secure beneath the surface.

This is where concerns arise. If students rely on generative tools too early, tasks feel easier than they should, and the habit of effort becomes less familiar. Self-regulation, persistence, and reading stamina are still developing in younger learners, and early exposure to AI can encourage dependency before these habits mature. It is similar to how students who lack physical activity or fine motor practice miss certain developmental steps. The risk with AI is cognitive rather than physical, but the principle is the same: skipping the practice weakens the foundation.

Sethi teaching non-AI skills at FOLSEA to encourage resilience in our younger learners

There is also the question of identity. Writing and expressing ideas help students develop their own voice. If too much of that process is handed off to AI, personal style forms more slowly and with less ownership.

Interestingly, the conference itself highlighted both sides of this tension. In my own session on using Google Gemini, I focused on how teachers can use AI to create high-quality materials rather than asking students to use AI directly. Resources such as Google Videos, improved imagery, well-structured explanations, and engaging activities can help students study more effectively while still doing the thinking themselves. The intention was to save teachers time and give them more flexibility to personalise learning, not to outsource cognitive effort to the technology.

That distinction felt important. When teachers use AI to enhance their planning, students benefit from richer, clearer, more accessible materials without losing the experience of building understanding through practice. It keeps AI as a professional tool rather than an early shortcut for learners.

As AI becomes more common in classrooms, the key issue is not whether we should use it, but how we can use it without weakening the developmental processes that underpin strong learning. Thoughtful sequencing, human-led modelling, and continued emphasis on core skill-building ensure that AI supports education rather than reshaping it too quickly.
