Nord Anglia Education
WRITTEN BY
Nord Anglia
19 February, 2026

Teaching children to think with AI. And learn the skills AI can't replace.


AI is already shaping how children learn, search, and make sense of the world. It’s in the apps children use, the answers they’re given, and the assumptions they make. 

Professor Rose Luckin is Founder and CEO of Educate Ventures Research and an internationally recognised expert on AI in education. Rose is working with Nord Anglia Education’s schools worldwide to ensure students learn to use AI tools in an age-appropriate and carefully guided way.

Here, she explains to parents what AI literacy really means, and how they can help their children stay curious, sceptical, and in control of their own thinking.

 

What does “AI literacy” mean at Nord Anglia for a seven-year-old vs a 17-year-old?

For a seven-year-old, AI literacy isn’t about using generative AI. It’s about building foundations. Children this age are already encountering AI through voice assistants, recommendation algorithms, and smart toys. They need to understand that these systems learn from patterns in data, that they’re made by people who make choices, and that computers don’t “know” things the way humans do.

Most importantly, we’re developing the questioning habits and critical thinking that will serve students later. Can you trust everything you’re told? How do you check? What makes a good source? These fundamentals of human intelligence – curiosity, scepticism, verification – are the bedrock of AI literacy, long before children touch a chatbot.

A 17-year-old, by contrast, is probably already using generative AI. They need sophisticated understanding: how these systems produce confident-sounding but incorrect responses, why fluency isn’t accuracy, how bias enters through training data, and crucially, awareness of their own thinking. Are they using AI to enhance their capabilities, or outsourcing the cognitive work that builds their intelligence? It’s this self-awareness that separates effective use from dependency.

 

How important is it for schools to teach the skills that AI can’t replace? And what does Nord Anglia’s approach tell us?

It’s essential. Schools and families need to work together to ensure young people develop the skills AI can’t replicate: judgement, ethical reasoning, creativity, resilience, and metacognitive awareness.

At Nord Anglia, we already know how to do this. Our two-year study into metacognition – teaching children how to think about their thinking – shows that when students explicitly learn how to plan, monitor, and evaluate their thinking, they develop stronger problem-solving, greater independence, better self-regulation, and more confidence in tackling complex tasks.

These outcomes matter because they’re precisely the skills required to use AI well.

Someone who has developed these skills doesn’t accept the first answer they see. They question it. They reflect on whether it aligns with their purpose. They notice when they’re relying too heavily on AI. They recognise bias, gaps, and errors because they’re actively engaged in their own thinking.

In other words, the skills that make a student academically successful are the same skills that allow them to leverage AI safely, effectively, and creatively.

This is why partnership matters. Nord Anglia’s schools teach these habits explicitly and systematically. Parents can reinforce them at home through conversation and modelling thoughtful use. Together, we can ensure AI becomes a catalyst for deeper intelligence and more powerful learning.

 


What is the simplest way to explain AI hallucinations to parents and children?

AI doesn’t understand a single word it produces. 

It recognises patterns in text; it has no grounding in reality, no lived experience. When it doesn’t know something, it doesn’t say “I do not know”. Instead, it generates plausible-sounding text that may be completely false.

For older children using these tools, the message is simple: confidence isn’t competence. AI sounds authoritative even when it’s wrong.

 

What are the most common challenges children face when using AI?

Three patterns concern me. First, accepting fabricated facts, particularly dates, statistics, and quotations. Second, mistaking AI’s confident tone for expertise. Third, and most troubling, gradually relying on AI for thinking that children should still practise themselves.

This cognitive offloading is where we need to be most careful. Recent research shows test scores dropping significantly when students rely on AI, and greater trust in generative AI correlating with reduced critical thinking. 

The risk isn’t wrong answers. It’s that young people stop developing the thinking muscles they need.

 

How should we teach children to keep their voice and originality when using AI?

By making it clear that AI is a thinking partner. It’s not an answer-generating machine.

AI can help brainstorm or challenge ideas, but original thought must come from the student. 

That’s why, at Nord Anglia’s schools, we’re teaching students to pause before accepting outputs and to ask whether the AI’s answer reflects what they actually think.

 

What is a short checklist families can use at home?

Parents don’t need to be technical experts. What matters most is encouraging conversation, curiosity, and healthy scepticism. Here are four crucial questions to ask when your child uses AI for learning:

  1. “What do I need?” Be clear about your purpose before you start. If you don’t know what you want, you can’t judge whether AI has delivered it.

  2. “Where did this come from?” AI doesn’t cite sources reliably. Find the original.

  3. “What is missing?” AI gives an answer, not the full picture.

  4. “How do I check?” Never submit AI-generated content without verification.

 

Read more about how we're helping students develop the skills AI can't replace.

 

About the author

Rose Luckin is an internationally respected academic and influential voice on the future of education and technology, particularly AI. With over 30 years of experience, she is a recognised expert on AI in education, serving as an advisor to policymakers, governments, and industry globally.

Rose is Professor Emerita at University College London and Founder and CEO of Educate Ventures Research Limited (EVR), a company that provides thought leadership, training, and consultancy to the education sector to help them leverage AI ethically and effectively.

At Nord Anglia Education, Rose is helping students and educators navigate AI thoughtfully and critically, with a focus on ensuring school communities use AI to become smarter and more capable thinkers, not to outsource the very learning that builds their intelligence.

Rose also serves as an advisor to Cambridge University Press and Assessment and is co-founder of the Institute for Ethical AI in Education. She is President of The Self-Managed Learning Centre in Brighton and sits on a range of advisory boards within the education and training sector.

Rose was honoured with the Bett Outstanding Achievement Award in 2025, named a Leading Woman in AI EDU at the ASU-GSV AIR Show in 2024, and awarded the 2023 ISTE Impact Award, becoming the first person outside North America to receive their top honour. She holds a PhD in Cognitive and Computing Sciences and a First Class Bachelor’s degree in AI and Computer Science.