
European schools are taking their first concrete steps to integrate AI in line with the EU AI Act, with educators and experts urging a measured, strategic approach to compliance.
At a recent conference on AI in education, school leaders and policymakers explored how to align AI adoption with the incoming regulation.
With key provisions of the EU AI Act already in effect and full enforcement coming by August 2026, the pressure is on schools to ensure their use of AI is transparent, fair, and accountable. The law classifies AI tools by risk level, with those used to evaluate or monitor students subject to stricter oversight.
Matthew Wemyss, author of 'AI in Education: An EU AI Act Guide,' laid out a framework for compliance: assess current AI use, scrutinise the impact on students, and demand clear documentation from vendors.
Wemyss stressed that schools remain responsible as deployers under the Act, even when using third-party tools, and should appoint governance leads who understand both the technical and ethical dimensions of AI use.
Education consultant Philippa Wraithmell warned schools not to confuse action with strategy. She advocated starting small, prioritising staff confidence, and ensuring every tool aligns with learning goals, data safety, and teacher readiness.
Al Kingsley MBE emphasised the role of strong governance structures and parental transparency, urging school boards to improve their digital literacy to lead effectively.
The conference highlighted a unifying theme: meaningful AI integration in schools requires intentional leadership, community involvement, and long-term planning. With the right mindset, schools can use AI not merely to automate tasks, but to enhance learning outcomes responsibly.