This bespoke newsletter is in addition to the monthly AI in Education newsletter, and keeps you updated on the progress of the DfE & DSIT AI Education Content Store project we are involved in.

AI Education Content Store
MONTHLY NEWSLETTER #2 - FEBRUARY 2025
Hackathon Edition

This newsletter aims to keep you updated on the development of the AI Education Content Store, commissioned by the Department for Education (DfE) and the Department for Science, Innovation and Technology (DSIT).

In this edition:
- What is the AI Education Content Store?
- Can you help us? We’d love to hear from interested schools and teachers
- The Project in Development: A focus on the first Hackathon
- AI and Safety: How will we ensure the content store is safe to use?
- FAQs: ‘What type of student work are you looking to ingest into the content store, and what will it be used for?’
- Meet the Team: Richard Beeson, programme manager at AI in Education & Bourne Education Trust
- Who are we working with? Preparing for the first teacher reference-group session
- Coming Up: Dates for the diary

What is the AI Education Content Store?

The AI Education Content Store will be a repository of AI-optimised educational material, developed in partnership with the sector and key education organisations. It will include curated content such as programmes of study, lesson plans and anonymised student work. The material in the content store will be gathered collaboratively with the sector – we will be guided by teachers, students and parents, as well as by educational leaders. Once we have amassed a centralised repository of AI-optimised content, we will offer access to edtech providers, who will use it to develop AI products for schools and colleges.

Can you help us?

We are starting to work with schools to gather content for the AI Education Content Store, and we’re very keen for all interested schools to be able to participate. If you’d like to be involved, please email: content-store-programme@faculty.ai

The Project in Development: A focus on the first Hackathon

The two-day hackathon using the AI Education Content Store took place at the end of January. It was attended by edtech developers, winners of the Department for Education’s AI for Education tools competition, and awarding organisations. As we will eventually be adding content to the store from a wide range of contributors, we wanted to make sure that content providers were represented, too. We are already meeting with the teacher reference group to collate teacher resources (see below), and representatives from Oak National Academy, which will also be providing content, were present at the hackathon.

Introduction to the AI Education Content Store

The day began with introductions from Tom Nixon, Managing Director of Faculty AI’s Applied AI division, and from Fay Skevington, Head of AI Alignment for Education at the Department for Education. Participants were given a general and technical introduction to the content store, talking them through how it works and what content is currently available: DfE curricular materials, additional DfE guidance and additional guidance from the Standards and Testing Agency.

They were then organised into teams, with members of different organisations deliberately placed together. The aim was to encourage them to work beyond their comfort zones, pushing the boundaries of their imagination and experience to date. Each team also included a mix of skills: software engineers with coding knowledge, as well as chief technology officers and chief executives from edtech companies, whose strengths lay in ideation, strategy and design.
Tackling problems

Every team was given a problem statement to work on, encouraging them to think about a product that could put the AI Education Content Store to new and innovative use.
Using the material provided by the content store, teams designed a product to tackle their problem. They built it, tested it and iterated on it, depending on the results of the tests. At the end of the second day, representatives from each team delivered a presentation to all participants, explaining what they had built, what challenges they had faced and what they had learnt during the process. They also discussed the ways in which the content store improved the quality and usefulness of their product, compared with general-purpose LLMs such as ChatGPT or Claude.

Making connections

Two MPs – Dame Emily Thornberry, whose constituency houses the Faculty AI offices, and Stephen Morgan, Minister for Early Education – visited the hackathon to observe its progress. They met representatives from Faculty AI, received an overview of the project and listened as the teams discussed different opportunities for the content store.

By the end of the two days, there was a tangible sense of excitement among participants about the potential opportunities created by the content store. In particular, edtech companies were pleased to have a timeline for content ingestion, allowing them to plan ahead for product development. Participants also appreciated the networking opportunity offered by the hackathon – the chance to meet people with similar areas of interest and share ideas and information.

To read our blog following the event, please click here.

AI & Safety: How will we ensure the content store is safe to use?

Safety is a core part of the AI Education Content Store programme. We are exploring how the content store can support edtech companies to create products that are not only aligned with best teaching and learning practice, but also safe enough to be used in education.

A team dedicated to AI safety in edtech spent four weeks exploring the harms that students could be exposed to through use of AI. We collated a list of 18 possible harms, ranging from bullying to depiction of violence and encouragement of suicide or self-harm. We’re also looking at how young people might develop an emotional relationship with, or dependency on, AI.

We examined potential scenarios where harm might be caused maliciously – for example, using AI to bully someone or to encourage them to engage in risky behaviour. And we examined non-malicious scenarios – for example, a time-poor chemistry teacher failing to conduct the necessary quality checks on AI-generated instructions, potentially creating risk in the classroom.

For each potential harm, we examined how it might manifest itself in edtech, what prompts it would require, whether it would be visual or text-based, and how the tool’s outputs would have to be disseminated in order to have a negative impact on users. We have also conducted market analysis of around 100 edtech products to understand where the biggest risks lie.

Following a prioritisation exercise, we’re now looking in greater detail at the six harms that we consider present the biggest, most immediate threat in an educational context. These are: self-administration of material; depiction of real or realistic violence or serious injury; bullying; inciting hatred; highly risky stunts; and radicalisation or extremism.

Ultimately, we’ll be providing technical guidance to edtech companies to help them use data-science techniques to detect potentially harmful outputs and to filter these out to protect users. An edtech tool could then flag a response as harmful and automatically alter its response to avoid the harm.
Other potentially problematic output – such as biased primary sources for a World War Two history lesson – could be flagged as “needs caution”.
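The detailed techniques will be set out in the forthcoming technical guidance. Purely as a rough illustration of the flag-then-filter flow described above, the sketch below shows how an edtech tool might pass a draft response through a harm check before showing it to a student. The harm categories are taken from this newsletter; the `classify_harm` scoring function, the thresholds and the fallback message are hypothetical placeholders, not part of the programme’s actual guidance.

```python
# Illustrative sketch only: a hypothetical post-generation safety gate for an
# edtech tool. Categories come from the newsletter; scores, thresholds and
# messages are invented placeholders.

from dataclasses import dataclass
from typing import Optional

# The six prioritised harm categories named above.
HARM_CATEGORIES = [
    "self-administration of material",
    "depiction of real or realistic violence or serious injury",
    "bullying",
    "inciting hatred",
    "highly risky stunts",
    "radicalisation or extremism",
]

BLOCK_THRESHOLD = 0.8    # hypothetical: score above which the output is withheld
CAUTION_THRESHOLD = 0.5  # hypothetical: score above which the output is flagged


@dataclass
class SafetyDecision:
    status: str                # "ok", "needs caution" or "blocked"
    category: Optional[str]    # harm category that triggered the decision, if any
    text: str                  # what the tool actually shows the user


def classify_harm(response: str) -> dict:
    """Placeholder for a real detector (e.g. a classifier model or rule set),
    returning a 0-1 risk score per harm category."""
    return {category: 0.0 for category in HARM_CATEGORIES}


def safety_gate(response: str) -> SafetyDecision:
    """Decide whether a generated response is shown as-is, shown with a
    'needs caution' flag, or replaced with a safe fallback."""
    scores = classify_harm(response)
    category, score = max(scores.items(), key=lambda item: item[1])

    if score >= BLOCK_THRESHOLD:
        return SafetyDecision(
            status="blocked",
            category=category,
            text="This response was withheld by the safety filter.",
        )
    if score >= CAUTION_THRESHOLD:
        return SafetyDecision(status="needs caution", category=category, text=response)
    return SafetyDecision(status="ok", category=None, text=response)
```

The real guidance may recommend very different detection methods; the sketch simply shows how a tool could combine detection, a “needs caution” flag and an automatic fallback in one step.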
FAQs: ‘What type of student work are you looking to ingest into the content store, and what will it be used for?’

Answer: Ingesting national-curriculum guidance and teacher resources into the AI Education Content Store is only part of the story. Adding student work – and, importantly, teacher feedback on that work – to the store will support benchmarking of tools and responses. When edtech companies use the content store to create marking and assessment tools, this will allow them to refer to the appropriate standard of work for each year-group setting. Accessing work taught by teachers, completed by students and then marked and quality-assured by teachers will also give edtech companies a fully rounded picture of appropriate levels at each stage of education. Ultimately, this will lead to more accurate tools, providing better outputs for students and teachers.

Meet the Team: Richard Beeson

What is your role?

I am the programme manager at AI in Education and Bourne Education Trust, and I’m a member of the core delivery team for the AI Education Content Store. I coordinate the multi-academy trust leader reference group and the teacher reference group. My aim is to ensure a clear link between practitioners in the education sector and what we as a project team need to be able to deliver for the content store. This enables the team to test hypotheses with an engaged group of end users.

Because of my background in education, I’m also able to advise the content store team directly on teaching pedagogy and the range of teaching resources teachers might need. For example, I can tell them which national reference materials teachers will refer to for long-term and short-term planning. Speaking to multi-academy trust leaders and teachers gives us a clear understanding of how teachers might use the resources that edtech providers may produce using the content store.

What’s your background?

After a BEd degree, I went straight into teaching physics, and I have remained in education for the past 27 years. I was in senior leadership for 16 years and then spent five years as the head of a secondary school. I’ve always been interested in how technology can enhance educational opportunities. As soon as GenAI became more publicly available, with the launch of ChatGPT in November 2022, I was hugely excited by its potential for education. By September 2023, I’d moved out of headship and taken on a role as digital and AI lead for a multi-academy trust, as well as working for AI in Education. Then, in September 2024, I moved full time to my current role.

What do you enjoy about working on the AI Education Content Store?

I enjoy collaborating with a wide range of professionals – data scientists and related experts at Faculty AI, as well as staff from ImpactEd and OpenEducationAI – to create the content store. Being able to apply strategic leadership and my extensive understanding of the education sector to the challenges it faces – hopefully providing a solution to part of those challenges – is exciting.

What do you do when you’re not working on the content store?

My children are involved in a range of sports, which means that I spend most of my spare time being dad’s taxi. But I like to fit in the (very) occasional round of golf to satisfy my own sporting interest.

Who are we working with?

This month, we will be holding the first of our teacher reference-group sessions. We’re currently in conversation with 12 members of teaching staff from nine different educational institutions about participating in this session.
The aim of these sessions is for practising teachers from a range of phases of education to offer insight into how they use pedagogical resources. This will allow us to enrich and enhance the data we feed into the content store in a way that will be practically useful for its end users – those same teachers. We will focus specifically on the following areas:

- Schemes of work and lesson plans: How do teachers develop schemes of work and series of lesson plans, as well as individual lesson plans? What thinking goes into the process? How do they structure it? How do they find relevant resources?
- Sequencing and assessment: How do teachers sequence student learning, in order to increase students’ understanding and knowledge as they progress through a topic? How do they adapt this over time, following formative assessment?
- Differentiation: How do teachers differentiate learning for the range of students in each class, in order to meet individual needs? How do they cater for students with SEND?

The sessions will be discussion-led. Information gathered during the reference-group meetings will be passed to the team of data scientists working on the content store. When teachers then submit content to the store, the reference-group insights will be used when tagging, enriching and enhancing this data. The information the teachers provide, therefore, will ultimately help any AI tools built using the content store to sequence and differentiate learning in the same way that a teacher would, drawing on their lived expertise and experience.

Because we want to avoid creating more work for teachers who support the project by providing content, the reference group will also discuss the most convenient way for them to submit material to the content store. This may be as simple as attaching a document to an email, or it may be through a submission portal. Rather than assuming we know what teachers want, we will canvass opinion and respond to feedback.

Coming Up...

March: Ongoing meetings of multi-academy trust, college, school, edtech and content-provider reference groups.

27 & 28 March: Second Hackathon, in which participants will test a larger collection of enriched and optimised public educational content in the content store against a series of assessment and feedback-related use cases.

Got a Question for the Core Delivery Team?

Email us at: content-store-programme@faculty.ai