The Most Human Schools Will Use AI the Most
- Michael Stone
- Aug 12
- 8 min read
Updated: Aug 13

As artificial intelligence becomes increasingly integrated into educational environments, the fundamental purpose of schooling is undergoing a seismic shift. No longer confined to the industrial-era model of information transmission, schools have the potential to become incubators of empathy, creativity, and ethical discernment. This piece provocatively challenges the traditional structures and argues for a future in which AI assumes mastery of content delivery and personalization, thereby liberating human educators to focus on cultivating the human condition. Rather than seeing AI as a threat to teaching, it positions AI as the greatest opportunity in generations to personalize and humanize learning at scale—if we are brave enough to let it.
AI: The Teacher Liberator?
The factory model of education—defined by age-based grade levels, bell schedules, and rigid, prescriptive curricula—was never designed for human flourishing; it was designed for economic efficiency. Today, with the advent of generative AI, we are living through a paradigmatic rupture. AI can now arguably curate, develop, scaffold, and personalize content delivery better and faster than most educators, and it’s only going to get better. That is not a threat to teachers. It is liberation.
For too long, teachers have been expected to serve as the primary vehicle for delivering content, even though we’ve known for decades that students forget most of what they “learn” in this mode. The real scandal isn’t that AI can now generate a personalized lesson on photosynthesis in seconds—it’s that we ever thought human ingenuity should be wasted on repeating that same lesson four times a day for 30 years.
The Rise of the AI Partner
In traditional schools, content has been king. We staff, schedule, create attendance and discipline policies, purchase curriculum, and test (and test and test) almost solely through the lens of content mastery. But in AI-augmented environments, content becomes the servant: on-demand, endlessly adaptable, and infinitely scalable. AI can now facilitate personalized experiences at a level never before possible. It can track a student's micro-progress, identify misconceptions in real time, simulate historical events, differentiate content and learning experiences, generate real-time formative feedback, and connect a third grader in Tennessee with a climate scientist in Nairobi.
This isn't theoretical. Students and teachers are already using AI tools like ChatGPT, Grammarly, and Khanmigo to write essays, plan science projects, and translate and scaffold assignments. Meanwhile, school leaders still debate whether AI is cheating or a tool. The kids aren’t waiting. Neither should we.
But here’s the caution: just because AI can make learning smoother doesn’t mean it always should. Cognitive science tells us that fluency (when information feels easy to process) creates the illusion of mastery, but actual learning often comes from moments of struggle. In one well-known study, college students who sat through a polished “super lecturer” physics class felt like they learned more, but those in a more frustrating, problem-based session actually scored higher on assessments (Deslauriers et al., 2019). The difference? The latter group engaged in what researchers call “desirable difficulties”: intentionally effortful learning conditions that feel harder in the moment but produce better long-term retention and transfer.
If we swing the pendulum too far and let AI remove all of the friction from learning, we risk creating students who feel confident but can’t transfer or apply their knowledge. The goal isn’t effortless mastery; it’s designing AI experiences that motivate the hard, productive work our brains need to truly learn.
Shifting the Role of Teachers
Once AI assumes the role of the content expert, what remains for the human teacher? Everything that actually matters.
Empathy. Curiosity. Morality. Courage. Collaboration. Critical consciousness. These are not byproducts of good teaching. They are the point. They should have always been the point.
Imagine if teachers were trained not as content deliverers but as mentors in human development. Their job would be to cultivate wisdom, not just transfer knowledge. To help students wrestle with questions like: What kind of person do I want to be? What is my obligation to others? How do I navigate ambiguity? How can I engage in productive conflict resolution? What are my unique interests and aptitudes, and how can I leverage their overlap to carve out a meaningful and fulfilling career? One of teachers’ most vital new roles will be metacognitive coaching—helping students read their own learning signals, normalize struggle, and plan next moves, then equipping them to engage in productive struggle as they navigate dynamic challenges.
This is not a soft-skills pivot. It is a re-centering of school around the only thing AI can never replicate: the irreducible, complex experience of being human.
The implications are significant and widespread, from content to class sizes and beyond.
Removing Time as the Constant
We know grouping students by age is pedagogically bankrupt. It assumes linear, uniform development in a world where personalization is not only possible but imperative. Cognitive science has shown for decades that children’s learning needs shift dramatically with developmental stages, not just in content readiness but in executive function, self-regulation, and social-emotional capacity (Best & Miller, 2010). A 9-year-old’s working memory and ability to plan are fundamentally different from a 16-year-old’s (Gathercole & Alloway, 2008). But development is not always age-dependent. Productive struggle for one student might be too frustrating or too trivial for another. Support should be personalized and developmental. AI can finally make it possible to match learning experiences not only to what a student knows, but also to how they are ready to think—scaffolding complexity, autonomy, and support by developmental stage while still allowing every learner to progress at their own pace.
Generative AI has the potential to empower us to discard the archaic expectation that every student master content by an arbitrary date when the unit ends. The question can finally become, “How well have you learned it?” instead of, “I hope you learned it by last Friday, because today we start unit three and never look back.”
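A minimal sketch makes the shift concrete. In the fragment below, the competency names and the 0.8 threshold are illustrative assumptions, not a prescribed standard; the point is simply that the gate is demonstrated mastery, not a date on the calendar:

```python
def ready_to_advance(mastery: dict[str, float], threshold: float = 0.8) -> bool:
    """Advance a learner when every competency in the unit meets the
    mastery threshold -- regardless of when the unit "ends"."""
    return bool(mastery) and all(score >= threshold for score in mastery.values())
```

A real system would layer in evidence quality and teacher override, but the core change is visible: progression is gated on learning, not on Friday.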
Perhaps we consider keeping age-based grouping for social and developmental reasons. Fine. Let’s preserve it for lunch, sports, extracurriculars, and advisory. But academically? It’s time to rethink everything.
AI’s impact on education and student development reaches beyond how students engage with content; this line of thinking opens a vast array of changes we might pursue.
What if Class Size Doesn’t Matter the Way We Think It Does?
AI collapses the scale problem in education. With the right tools, a single teacher could oversee the academic progress of two or three times the normal number of students—if their job is no longer to lecture, but to intervene strategically as a mentor, provocateur, and guide. Conversely, if we want teachers to be relational anchors—coaches of human development—we might need to reduce class sizes dramatically.
Either way, our assumptions about class size are built on outdated constraints. Stop designing for adult convenience and legacy budgets; design for what helps students learn deeply. We should have been doing this years ago, but the transformative power and rapid market saturation of generative AI may finally provide the catalyst for the changes reformers have pushed for over the last few decades.
Redesigning the Day
Consider a 12-year-old student in a reimagined school. She begins her day with an AI-powered reflection tool that helps her set goals, track her wellbeing, and surface recent achievements or challenges. Her schedule is not divided by bells but by learning sprints—some in collaboration with others, some in deep independent focus, some guided by an AI mentor, and others guided by a human.
During one learning sprint, she works on a real-world climate problem, accessing datasets and running simulations through her AI assistant. She uses AI to research intersections between environmental challenges and historical injustice, but her analysis and moral reasoning are sharpened in Socratic dialogue with her teacher and a small group of students engaged in the same sprint. She ends the day presenting her findings to a multi-age peer group, revising her project portfolio, and earning micro-credentials in a tool like FabFolio. These aren’t just tokens for completing work. They draw on research-based assessment practices like retrieval practice, which strengthens long-term memory through active recall (Roediger & Butler, 2011), and spaced repetition, which improves retention when learning is distributed over time (Carpenter, 2022).
AI can harness these techniques in the background while simultaneously tracking not only her mastery of academic content, but also her growth in key human competencies like empathy, agency, collaboration, persistence, and self-regulation, through process indicators embedded in her daily work. Human-scored rubrics anchor these indicators, while AI assists with aggregation and trend detection, not final judgment. By combining traditional evidence of learning with these richer, human-centered metrics, progress becomes a multidimensional portrait of development rather than a single score on a test.
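One hedged sketch of what “aggregation and trend detection, not final judgment” could mean in practice (the competency names, rubric scale, window size, and drop threshold are all illustrative assumptions): the system only surfaces competencies whose recent human-scored rubric entries dip below their earlier baseline, and the teacher decides what, if anything, that means.

```python
from statistics import mean

def flag_declining_trends(scores: dict[str, list[float]],
                          window: int = 3, drop: float = 0.5) -> list[str]:
    """Return competencies whose average over the last `window` rubric
    entries falls more than `drop` below the earlier average.
    Flags are prompts for human review, not verdicts."""
    flagged = []
    for competency, history in scores.items():
        if len(history) <= window:
            continue  # too little evidence to compare against a baseline
        baseline, recent = history[:-window], history[-window:]
        if mean(baseline) - mean(recent) > drop:
            flagged.append(competency)
    return flagged
```

The design choice worth noticing is the division of labor: humans generate the scores and own the interpretation; the software only does the arithmetic of noticing.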
This isn’t science fiction. Every piece of technology needed to see this in action already exists. What’s missing is the courage to dismantle what no longer serves us, and the critical mass to push beyond conversation and to challenge the status quo with action.
The Illusion of Competence
In our enthusiasm for AI, we risk forgetting what schools actually are: the most consistent public space where children learn how to be human together. AI may write essays, but it cannot model grace under pressure. It may provide instant feedback, but it cannot sit with a grieving child after the loss of a loved one.
There’s another, quieter danger. When AI makes learning feel too smooth, it risks creating an illusion of competence. Cognitive science shows that our brains often mistake fluency for true mastery. That’s why students in the polished, passive physics lecture felt like they’d learned more yet retained less than their peers who wrestled through messy, active problem-solving. If we design AI to remove all friction (e.g., writing the essays for students), we may graduate confident learners who crumble when asked to apply knowledge in complex, real-world contexts.
In other words, if we treat AI as a replacement for teachers, we will fail spectacularly. If we treat it as a liberator of teacher humanity, we may just build the most transformative system of learning the world has ever seen.
Consider these actions:
Design for friction: build in retrieval, spacing, and generative explanations.
Coach metacognition: make learners predict, explain, and self‑assess.
Measure process, not just product: track revision, persistence, and collaboration quality.
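“Design for friction” can be made concrete with a minimal Leitner-style sketch of retrieval practice plus spacing. The box intervals below are illustrative assumptions, not research-validated values: each successful recall pushes an item out to a longer interval, and each failure sends it back for near-term review.

```python
from datetime import date, timedelta

# Illustrative review intervals per Leitner box (assumed, not prescriptive).
INTERVALS = {1: 1, 2: 3, 3: 7, 4: 14, 5: 30}  # box -> days until next review

def next_review(box: int, recalled: bool, today: date) -> tuple[int, date]:
    """Promote an item on successful retrieval, demote it to box 1 on
    failure, and schedule the next effortful recall attempt."""
    new_box = min(box + 1, max(INTERVALS)) if recalled else 1
    return new_box, today + timedelta(days=INTERVALS[new_box])
```

The specific intervals matter less than the stance: the tool schedules effortful recall rather than smoothing it away.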
The Future Isn’t Optional
We stand at a three-pronged fork in the road. One path keeps us on our current course—a system that overvalues content mastery and sets arbitrary content gates in a world where content has been completely commoditized. Another, over-reliance on AI, leads to algorithmically efficient learning devoid of soul—personalized but isolating, productive but shallow—potentially accelerating learning while dehumanizing the very students it’s intended to serve. The third leads to a renaissance of humanity in education, where AI powers personalization and scale, and teachers devote themselves to what only humans can do.
We don’t need better schools. We need different ones. And we don’t need better teachers. We need freer ones—freed from the tyranny of content delivery, empowered to nurture wisdom and wonder, equipped to facilitate deep learning as a consequence of rich experience (as my friend Gary Stager has advocated for decades).
The future is not coming. It’s here. The only question left is: Will we have the audacity to meet it? Because the real risk isn’t that AI will replace teachers. It’s that we’ll use it to create the perfect illusion of competence, turning out students who feel educated, but can’t navigate the messy, unpredictable challenges of the real world. The schools that thrive in this new era will be the ones that design AI-powered learning to keep the struggle in the story—where confidence comes not from how easy it felt, but from the proof that you wrestled with hard things and grew stronger for it.
References
Best, J. R., & Miller, P. H. (2010). A developmental perspective on executive function. Child Development, 81(6), 1641–1660. https://doi.org/10.1111/j.1467-8624.2010.01499.x
Carpenter, S. K. (2022). The science of effective learning with spacing and retrieval practice. Nature Reviews Psychology, 1(1), 89–100. https://doi.org/10.1038/s44159-022-00013-1
Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences, 116(39), 19251–19257. https://doi.org/10.1073/pnas.1821936116
Gathercole, S. E., & Alloway, T. P. (2008). Working memory and learning: A practical guide for teachers. SAGE Publications Ltd. https://doi.org/10.4135/9781446213665
Roediger, H. L., III, & Butler, A. C. (2011). The critical role of retrieval practice in long-term retention. Trends in Cognitive Sciences, 15(1), 20–27. https://doi.org/10.1016/j.tics.2010.09.003