In recent years, there has been significant improvement in how educators implement educational technology (EdTech). This is in part thanks to the work of Dr. Fiona Aubrey-Smith, author and researcher, who, together with Peter Twining, wrote about the importance of the pedagogical and purposeful use of technology, coining the term PedTech.
When it comes to the recent implementation of AI in education, a lack of knowledge, expertise, leadership and governance of AI in schools means we all too often lose that sense of pedagogy and purpose, seduced by the promise that AI will solve the workload, workflow and retention issues education currently faces. The AI in education market is currently driven by companies selling software and/or a service, rather than by purpose or pedagogy.

It’s no wonder companies are trying to grab their own piece of the action. The Artificial Intelligence in Education Market Report for 2024–2029 predicts that the AI in education market will be worth $43.47 billion by 2029, compared with $5.47 billion in 2024; hence the overwhelming nature of AI in education right now. Whilst a wealth of AI solutions sounds great for education, a lack of UK government legislation (for both AI developers and for AI in schools) means AI threats (ethics, bias, safety and security) and AI compromises (quality, expertise, safety and security) are a real possibility. School leaders must do their homework on each company they endorse and each tool they implement.
The recently published AI Playbook for UK Government (guidance, not legislation) goes some way towards informing schools and developers of the ethical and technical requirements for the use of AI, but schools urgently require further advice, guidance and direction from educational experts.
When implementing AI, we must not only maximise the impact it can have on teaching, learning, inclusivity and the organisation as a whole, but also mitigate the risks it poses to ethics, bias, safety and security (a topic for another article). For the purposes of this article, the priority is this: schools must place human intelligence alongside artificial intelligence.
I simplify human intelligence into five categories: knowledge, persistence, curiosity, imagination and discipline. Without instilling these qualities in our staff and students, we add to the risks that AI poses.
Human Intelligence (HI) and Artificial Intelligence (AI)
- Knowledge: Theoretical and practical understanding of AI. Leaders must ensure that staff and students have an understanding of, and training in, both the technical and ethical requirements of using AI. This includes, but is not limited to, prompt design, data privacy, hallucinations and bias. Leaders can support the safe and secure use of AI by leading informative professional development and creating an AI inventory (bespoke to your context) of appropriate and purposeful AI tools. To develop this inventory and training, you may need to consult AI and education experts.
- Persistence*: Resilience to adversity when using AI. Rightly so, when implementing technology and AI you must have rigour in your policy and process. Whilst this supports security and safety, it can elongate the implementation process and create a barrier. You must be resilient when implementing, using and developing AI. All staff must overcome AI uncertainty; the knowledge category can help with this, but if some AI anxiety remains, there are many AI early adopters and education experts out there who can help settle it.
- Curiosity: Exploration and investigation of AI. We must be curious about the innovation and ideas that are out there. AI is advancing rapidly in capability and ingenuity; whilst the wealth of AI solutions is daunting at first, you will soon sort the wheat from the chaff and find a tool that fits your requirements and context. The key is little and often: dip into LinkedIn, X or blogs occasionally to see what tools are being developed and implemented elsewhere, and you will soon gain a good grasp of what is available.
- Imagination*: Innovation through the use of AI. Beyond being curious about what tools are already available, be imaginative about how AI and its advancements might enhance education or solve that long-standing problem you and your school face. The bounds of AI are often set by your imagination, and imagination combined with curiosity can lead to innovation. But remember that an innovation (a new idea for you) might already have been tried and tested by someone else.
- Discipline*: Critical consideration of the purpose and process of AI. We must be critical about the purpose of AI; this is where establishing a school/MAT vision for AI (and/or EdTech) is important to justify its procurement and implementation. Sometimes it is okay to say no to AI, or to use it in balance with more traditional methods. Your decision whether or not to use AI might come down to cost versus impact, and cost is not exclusively financial: you also need to consider the readiness of staff to implement something new, the capacity of staff to undertake training, and the data you exchange. One key consideration is that everything comes at a cost… if an AI/EdTech tool is free, what data are you giving away?
*Some of the aforementioned qualities are inspired by Bill Lucas’s A field guide to assessing creative thinking in schools (2022) and the five-dimensional model of creativity created by the Centre for Real-World Learning (CRL) and supported by Creativity, Culture and Education.
Importantly, and excitingly, as part of Big Education’s Rethinking School project, and in collaboration with the Good Future Foundation and other educators, we are developing an AI Framework designed to be a catalyst for consideration and to support AI within education. It will be accessible to all educationalists, but it is intended to be used by school and sector leaders to implement AI effectively. The Human section you have had insight into above is one of eight sections for consideration, and no one section works in isolation.
Should you want a sneak peek at the draft framework please contact [email protected]. We would welcome future collaboration and feedback as part of an upcoming pilot.
The title of this article, ‘We don’t need AI to educate, but we need to be educated on AI’, is a provocation and a personal assessment of the factors that schools should consider prior to implementing, and throughout their use of, AI. The title paraphrases an older quote from David Geurin, who in 2017 said, ‘Classrooms don’t need tech geeks who can teach. We need teaching geeks who can use tech’, referring to technology in general rather than modern AI advancements.