This article is a beginner’s guide to artificial intelligence (AI). This technology, although it has been around for a while, evolves quickly and can be daunting if you’re not using it in your everyday life.
Don’t worry – you don’t need to know how AI works in detail, or use it yourself, to be able to hold leaders to account. This article will give you the basic information that you need to be able to ask questions about AI use with confidence.
What is AI?
AI isn’t new
Artificial intelligence (AI) is the use of computer systems to solve problems and make decisions. It’s already a part of everyday life – you may have come across it in the form of predictive texting, talking to Siri on your phone or using route-planning apps.
However, the technology is developing rapidly and throwing up many new challenges for schools.
What is generative AI?
Generative AI (sometimes referred to as ‘Gen-AI’) refers to artificial intelligence tools that generate new, ‘natural’-seeming content – content that seems like it was created by a person, not a computer. Tools include:
- Chatbots such as ChatGPT, Google Gemini and KeyGPT (available with our sister service, The Key Leaders), which generate text
- Text-to-image programs such as DALL-E and Midjourney, which create images
What can it do?
Generative AI tools can help with a wide range of tasks, and when used correctly can save a lot of time. For example, AI can:
- Write something new
- Analyse or summarise something
- Generate ideas
- Give feedback
It’s important to remember that although AI can help with many tasks, it isn’t flawless. It:
- Can’t think for itself or understand what it is saying
- Can’t predict the future
- Can make mistakes – anyone using AI will need to check its work
- Can be biased
Where does it get its knowledge from?
Generative AI models are trained by the companies that make them. During development, the company spends time ‘teaching’ the model using large amounts of information, which the model then draws on when writing its answers.
This is why these tools can have biases – as humans we all have unconscious biases, and these can be passed on to the tool through the information it’s trained on and the people training it.
Free vs paid-for tools
Free tools will often use the information that users give them to train the model further, so it’s important to be extremely careful when using these free versions. Find more information in the 'make sure AI is being used safely' section below.
A paid-for version is more likely to have additional protections in place, meaning it doesn’t use the data users give it – often it will say at the bottom of the page that it isn’t using your chats to improve its models. Some AI tools also have an option to opt out of your chats being used to train the models. Ask your school/trust leaders how they manage this in your setting, and how they make sure staff are aware of these differences.
Why do I need to know about AI?
AI is already being used in many different ways both in and out of schools. It’s important that schools develop strategies to make sure that it’s being used safely and responsibly, in order to stay on top of the evolving technology.
As a governor or trustee, you’re responsible for holding leaders to account for effective running of the school/trust, and having robust procedures in place for the use of AI is a part of that.
Remember: you don’t need to be an expert. Gaining a basic knowledge of how your school/trust is using AI and the important safety aspects to consider is a good starting point.
How AI is used in schools
AI is being used more and more in schools, and can be a great tool to help cut down on workload, handle communications better or understand things in greater depth.
Here are some examples of what AI is being used for in schools and trusts. These may not be authorised uses in all schools, so you’ll need to check your policies to see which ones your setting has chosen to use.
School/trust leaders:
- Summarise long documents
- Plan out pieces of work
- Write letters to stakeholders, or sense-check drafts to make sure the tone is right
- Save time with recruitment – e.g. by asking AI to write job adverts, interview questions and job descriptions for leaders to review
- Draft reports
- Research school improvement strategies
Staff members:
- Generate new ideas for lesson planning
- Create banks of comments for report writing
- Write quiz questions to test pupils’ knowledge
- Plan interventions
- Create assembly packs
- Generate images to use in lessons
Pupils:
- Get help to understand and plan out homework tasks
- Brainstorm ideas for creative writing
- Learn about new subjects
- Generate images
- Research topics
Please note: School leaders will need to monitor pupils' use of AI carefully to make sure that it’s being used safely and responsibly – see more on this below.
Governors/trustees:
- Summarise school visit notes
- Ask questions
- Reword a document or letter
- Summarise headteacher reports (you will still need to read the full report)
- Understand policies and procedures
Remember: Follow the rules below when using AI to make sure you're doing so safely, and never input any personal information into an AI tool.
Make sure AI is being used safely
Although AI can be a great tool in schools, it does come with risks. Your school/trust leaders are responsible for making sure that AI is used safely in your setting, and you’re responsible for monitoring their work on this.
The Department for Education (DfE) sets out standards for AI use in education that school leaders should follow.
Keep school data safe
Schools need to make sure that they’re complying with data protection regulations when using AI.
To do this, it’s important that school leaders understand how generative AI models use the data they’re given. Some AI tools use the data given to them to ‘train’ their AI model, so they’re taking and storing the information given to them.
Ask leaders which tools they’re using and how these tools use the data
Technology platforms and products (such as management information systems (MIS) and cloud storage) are increasingly using AI. However, many of these are designed to be used by companies or organisations like schools, and should comply with safe data handling requirements.
School/trust leaders should specify which AI tools they authorise for use in the school or trust, and how they know that these are safe. Many tools are available as personal or consumer products, so they may not meet the legal requirements for data handling in schools.
Never enter personal information into a tool you don't trust
You shouldn’t enter sensitive data into a generative AI tool if you’re not sure how it will use or store the data.
Entering personal information into an AI tool that doesn't meet requirements for handling data safely could mean that your school/trust is in breach of data protection laws. It’s really important that school/trust leaders set out clear expectations for how everyone in school, including governors and trustees, can use AI safely.
Don't enter personal data into:
- Tools that aren't designed for sharing institutional data, such as free or consumer products
- Tools which don't align with your school or trust's data practice processes – for example, ones that allow your data to be used for AI training
What does this look like in practice?
- A governor is using a free AI tool to summarise complaints documents. They need to remove anything that could identify parents/carers, pupils or staff before entering the information into the tool
- A SENCO is using a free AI tool to sense-check an education health and care (EHC) plan. They need to remove anything that would make anyone in the plan identifiable, to keep their information safe
Use tools chosen and vetted by your school or trust
Your school or trust's AI policy or data protection policy should set out which tools can be used for particular use cases. For example, your school/trust might have access to Gemini as part of an institutional Google package, or make use of KeyGPT as part of a Key subscription, and encourage staff and governors to use them for specific tasks. These services are designed for companies and institutions, and to handle your data safely.
If you aren't sure whether what you'd like to do with an AI tool is a safe use of personal data, check with your data protection officer (DPO). Read more about AI and data protection policies below.
Prevent misuse with policies, procedures and monitoring
School/trust leaders need to consider how their policies and procedures set out expectations around use of AI, and how they’re monitoring this. Some documents that could set this out include:
Data protection policy
Leaders should update this policy to include information about the impact of AI on data protection, and what steps they’re taking to manage it.
Take a look at our model data protection policy – we’ve updated it with a new section on AI that school/trust leaders can adapt and use for their own policies.
Digital strategy
Leaders should consider how they’re going to use AI and incorporate it into their wider digital strategy.
AI policy
Some schools might choose to have a separate AI policy that sets out the:
- Approved AI tools and uses by staff
- Use of AI by pupils
- Consequences for misuse
- Data protection implications
- Importance of correcting errors and bias
- Monitoring of AI usage – for both staff and pupils
Find out more about this policy and how to review it here – including examples of questions you can ask about AI.
Safeguarding policy
This might not seem like the usual place to find AI information, but it’s an important part of keeping pupils safe online.
Many apps that pupils have access to use AI – Snapchat and TikTok, for example.
Your policy may include a brief outline of how AI integrates with online safety, or your safeguarding link governor may ask questions about AI on their monitoring visits.
Homework policy
This is a good place for schools to set out expectations for pupils and parents. Leaders should consider how the school/trust approaches homework and whether they need to revise the homework policy, to take into account pupils’ access to generative AI tools.
The DfE suggests this in its policy paper on generative artificial intelligence in education.
Other policies on exams, coursework or plagiarism
The Joint Council for Qualifications classifies AI misuse – where a pupil submits AI-written work as their own – as malpractice. Leaders should check whether their policies reflect this, and revise them if needed.
Make sure your leaders keep you informed
This technology is evolving quickly, and schools/trusts need to react to any updates. Alongside making updates to policies, leaders need to make sure that they have a strategy to tackle anything new as and when it arises. This could include:
- Informing staff of new updates and what this means for their practice
- Communicating with parents/carers when something changes
- Talking about changes to AI in assemblies or lessons
These changes may need to happen more quickly than the policy approval process allows. Your leaders should keep you in the loop, so that you know what changes have happened – and why – when the relevant policy gets to you for approval.