GDPR has been in force since 2018, yet most teachers are still unsure what it actually requires of them when it comes to AI tools. The official guidance tends to be written for data protection officers rather than classroom teachers, and the advice that circulates in staffrooms is often a mixture of things that are technically correct, things that are overcautious, and things that are simply not accurate.
This article is an attempt at a clear, honest, non-legal account of what the rules actually say and what they mean in practice for a teacher who wants to use AI sensibly.
What GDPR actually covers
GDPR (since Brexit the UK has its own version, UK GDPR, but the substance is almost identical) governs how personal data is collected, stored, processed and shared. Personal data means any information that can identify a living individual. That includes names, photographs, email addresses, and any other information that, combined with other data, could identify someone.
For teachers, the relevant personal data is information about their pupils. Names, ages, year groups, attainment data, SEND information, behaviour records, and any written observations or notes about individual children all count as personal data and are therefore subject to GDPR protections.
The key principle that matters here is data minimisation: you should only share personal data when it is necessary to do so, and only with the systems or people who need it to fulfil a legitimate purpose.
What “training data” means and why it matters
When you use a general-purpose AI tool like ChatGPT, a key question is whether the information you type into it is retained and used to train future versions of the model. OpenAI and other providers have different policies on this, and those policies can change over time.
If you type a child's name, year group, attainment level and a set of observations into a chat interface, you are sharing personal data with that service. Whether or not it is “used for training” in the technical sense, it has still left your school's systems and entered a third-party platform. That is the part that matters most from a compliance standpoint.
The ICO (the UK's data protection regulator) has published guidance on AI and personal data that is worth reading if you want the full picture. The short version is that organisations need to understand what they are sharing with AI tools, on what legal basis, and where that data goes.
Using ChatGPT for school reports: is it okay?
This is the question that comes up most often in schools right now. The honest answer is: it depends on what you type into it.
If you ask ChatGPT to write a generic report for a fictional child, there is no personal data involved and there is no GDPR concern. If you type “write a report for Oliver Thompson in Year 4 who is struggling with reading and has an EHCP for dyslexia”, you have just sent personal data about a real child to a third-party platform without a data processing agreement, without the knowledge of the child's family, and without your school having approved that tool for that purpose.
Most schools have not approved ChatGPT as a tool for processing pupil data, and under UK GDPR, schools are required to ensure that any third-party data processors have appropriate safeguards in place. That is a higher bar than simply trusting that a company's privacy policy is probably fine.
Questions to ask about any AI tool for school use
Before using any AI tool for anything involving pupil data, it is worth asking the following questions. Some of these you can answer from the tool's privacy policy. Others you may need to ask directly.
- Does this tool store any data I send to it? If yes, for how long and where?
- Is any of my data used to train or improve the AI model? This should be explicitly stated in the privacy policy or terms of service.
- Where is data stored? UK GDPR has specific requirements about data transfers outside the UK and EU.
- Is there a Data Processing Agreement available? For any tool that processes personal data on behalf of your school, your school (as data controller) should have a DPA in place with the provider.
- Has your school approved this tool for use with pupil data? Even if the tool itself is compliant, your school needs to have gone through its own assessment process.
What Staffroom does differently
Staffroom is built specifically for school use, which means these questions have been central to how the product works rather than an afterthought.
Before any AI processing takes place, pupil names are automatically hashed and replaced with anonymous identifiers. The name “Emily” becomes something like “Pupil A” before it ever reaches an AI model. This means that even if data were somehow intercepted or retained, there would be no personally identifiable information in it.
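To make the general technique concrete, here is a minimal Python sketch of this kind of pseudonymisation. It is illustrative only, not Staffroom's actual implementation: the `pseudonymise` function and its "Pupil A" labelling scheme are hypothetical.

```python
def pseudonymise(text: str, names: list[str]) -> tuple[str, dict[str, str]]:
    """Replace each known pupil name with an anonymous label.

    The name-to-label mapping stays on the school's own systems, so real
    names can be restored in the AI's output later; only the placeholders
    ever leave the local environment.
    """
    mapping: dict[str, str] = {}
    for i, name in enumerate(sorted(set(names))):
        # Pupil A, Pupil B, ... (this simple scheme covers up to 26 names)
        label = f"Pupil {chr(ord('A') + i)}"
        mapping[name] = label
        text = text.replace(name, label)
    return text, mapping

safe_text, key = pseudonymise(
    "Emily has settled in well and Emily now reads with confidence.",
    ["Emily"],
)
# safe_text: "Pupil A has settled in well and Pupil A now reads with confidence."
```

The important design point is that the mapping (`key`) never leaves the school: the external AI service only ever sees the placeholder labels.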
Your data is stored on secure UK infrastructure, is never shared with third parties for any purpose, and is never used to train any AI model. The tool is designed to help you write better reports while keeping your school fully in control of its data.
Practical steps for your school
If your school wants to use AI tools responsibly for tasks involving pupil data, here are the practical steps worth taking.
- Speak to your Data Protection Officer or data lead before introducing any new AI tool into your workflow. They will need to carry out a Data Protection Impact Assessment for any tool that processes personal data at scale.
- Check whether the tool has a GDPR-compliant privacy policy and whether it is willing to sign a Data Processing Agreement. Reputable tools built for school use will offer this as standard.
- Establish a clear school-wide policy on which AI tools are approved for use with pupil data and which are not. Without this, individual teachers are left to make judgment calls that should really be institutional decisions.
- When in doubt, anonymise. If a tool does not have privacy protections built in, you can add your own by removing names and any other identifying information before you type anything in.
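On that last point, even a very simple redaction pass helps. The sketch below is illustrative only (the `redact` function is hypothetical, and a hand-written pattern will never catch every identifier), but it shows the idea of stripping names and email addresses before text is pasted into an external tool:

```python
import re

def redact(text: str, names: list[str]) -> str:
    """Remove obvious identifiers before sending text to an external tool.

    Only catches the names you supply plus simple email patterns; treat
    it as a safety net, not a guarantee of anonymity.
    """
    for name in names:
        text = re.sub(re.escape(name), "[pupil]", text, flags=re.IGNORECASE)
    # Crude email pattern; real addresses can be more varied than this.
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[email]", text)
    return text

print(redact("Oliver Thompson (oliver.t@example.com) is in Year 4.",
             ["Oliver Thompson"]))
# → [pupil] ([email]) is in Year 4.
```

A safety net like this does not replace your school's approval process, but it meaningfully reduces what an unapproved tool could ever see.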
The ICO's position
The Information Commissioner's Office has been clear that AI use in organisations needs to be grounded in the same data protection principles as everything else: lawfulness, fairness and transparency, purpose limitation, data minimisation, accuracy, storage limitation, and integrity and confidentiality. It has also signalled that it will be paying attention to how organisations are using generative AI, particularly where children's data is involved.
The full ICO guidance on generative AI is available on their website and is worth a read if you are responsible for data protection policy in your school. The short version: understand what you are sharing, why, and what the tool does with it.
Worried about using AI tools with pupil data? Staffroom anonymises pupil names before any AI processing, stores data on UK servers, and never uses your information to train models. See our full approach on the security page.