May '25 Update
In this edition:
Schools and Teachers Stepping Up their Professional Development
Foundational Impact Podcast: Sharing from Computing at School and Belgrave St. Bartholomew’s Academy
Good Future Foundation Turns One!
Good Future Foundation has just celebrated its first year, and we're excited to share our Impact Report with you!
The report highlights how we've collectively explored the new territory of Artificial Intelligence in education over the past 12 months. From our humble beginnings to now reaching educators across different regions, and even countries, sparking important conversations and establishing networks of best practices of responsible use of AI in schools.
As we embark on our second year, we're more committed than ever to supporting educators navigating the evolving landscape of AI in education. Your continued engagement makes our mission possible. We welcome your thoughts on the report and would love to hear how we might better serve our mission in the coming year.
23andMe / AI / Data
AI promises to ease administrative tasks for teachers, but at what cost to student data privacy and intellectual property?
Yom Fox, Principal at Georgetown Day School in Washington DC, and I caught up at the Institute of Global Learning conference in Miami last month, and we talked through the news about 23andMe filing for bankruptcy and the bigger question of what happens to sensitive personal data when companies go under. That conversation quickly led us into another topic we’re both passionate about: the growing use of AI tools in UK/US schools and the serious concerns around student data collection.
If a company like 23andMe can file for bankruptcy and leave millions of people wondering what happens to their genetic information, what about edtech companies we are trusting with student data?
Yes, companies like OpenAI offer clear privacy policies, stating that they don't use customer data for training by default, employ encryption standards, and retain information for limited periods. Similarly, 23andMe implemented robust protections including SSL encryption, opt-in research programs, and compliance with international privacy frameworks. They even stated that any prospective buyer should comply with their privacy policy.
And yet, despite all of that, 23andMe still suffered a major data breach in 2023 that exposed sensitive personal information for millions of users, showing that strong policies on paper don’t always translate to strong protections in practice.
In my work at Good Future Foundation, I'm learning more about ways schools are adopting AI solutions that promise to save teachers time through marking and grading. But what happens to all that student information if these companies change hands or close down?
This is especially concerning since these tools collect personal information about children, who might be writing essays and completing activities while their content is uploaded to these systems daily, creating valuable data profiles.
These profiles could contain insights into a child's learning patterns, emotional responses to challenges, subject preferences, and even indicators of neurodiversity or learning difficulties that haven't been formally diagnosed. Such detailed information, if misused, could affect a student years later.
For instance, writing samples processed through AI systems could theoretically be used to build personality assessments that might influence future employment screening algorithms, especially as more companies adopt AI recruitment tools that analyse patterns.
Of course, there are also positive aspects to collecting specific information, such as creating personalised learning profiles that model and adapt content to differentiate work. But there are considerations that need addressing…
While UK schools must comply with GDPR under the supervision of the Information Commissioner's Office, the U.S. has no single law or enforcement body offering the same protections. Many teachers tell me - and, as Yom shared, in the US - they don't fully understand what data the technology they use is gathering, where it's stored, or how long it's kept. This is totally fair - when have folks had the time to learn about this since AI bombarded its way into our lives?
Before bringing AI tools into classrooms, school leaders need clear security protocols and commitments about data handling. This is particularly important for applications built on OpenAI's API or similar services: despite their current policies, acquisitions or regulatory changes could alter how student data is handled in future. If there are processes in place to protect against this, what are they and how can they be verified?
Better privacy protection methods already exist. Techniques like Differential Privacy allow data to be analysed without exposing individual information, by adding calculated "noise" to the data while still preserving its statistical usefulness. (Apple Differential Privacy Overview, Harvard Privacy Tools Project)
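To make the idea concrete, here is a minimal Python sketch of the Laplace mechanism that underpins Differential Privacy. The student records, field names, and epsilon value are all hypothetical, chosen only to illustrate how noise protects individuals while keeping aggregate statistics useful; this is a toy sketch, not a production privacy implementation.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two exponential draws follows a
    # Laplace(0, scale) distribution -- a standard sampling trick.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(records, predicate, epsilon: float) -> float:
    """Differentially private count: true count plus Laplace noise.

    A counting query has sensitivity 1 (adding or removing one student
    changes the count by at most 1), so a noise scale of 1/epsilon
    gives epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical student records -- illustrative only
students = [{"name": f"s{i}", "needs_support": i % 4 == 0} for i in range(100)]

noisy = dp_count(students, lambda s: s["needs_support"], epsilon=0.5)
print(round(noisy))  # close to the true count of 25, but deliberately not exact
```

With a small epsilon (stronger privacy) no individual's answer can be inferred from the output, yet the aggregate count stays roughly accurate, which is why the technique suits cohort-level analytics rather than per-student profiling.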
I’m also working with a school in Oxford where they created a simple checklist for vetting new educational technology and now require all vendors to answer their data privacy questions in writing before purchasing.
When we choose technology for our classrooms, we're making decisions that can impact students' privacy for years to come. The 23andMe situation reminds us that even established companies with millions of users and strong privacy policies can face uncertain futures and security challenges.
Asking direct questions about data practices often reveals how seriously a company takes these concerns. Our students may not yet grasp the full implications of their digital footprint, but we should spend more time thinking about how we can make the best choices on their behalf.
Last but not least, I want to thank Yom for her continued input, inspiration and thought leadership as we navigate these complex challenges together!
Pedagogy, Not Technology
In short:
Goal → pedagogy → tool: treat AI as the means, not the end.
Three visions—Conservative, Baseline, AI‑Forward—show how schools can benefit today, soon, and one day.
By the Varian Rule, learning experiences that once belonged to the 1% will reach every classroom.
The scarce resource in classrooms of tomorrow is teachers' understanding of their students' needs.
Prompt packs, chatbots, and dashboards sell the fantasy that tools alone transform learning. In a lot of the social media discourse I see about AI for teachers, the tool becomes the destination, and pedagogy retrofits around its quirks. But surely the right way to think about AI in education is to 1) start by defining learning goals or problems to solve based on teachers’ understanding of their students, 2) choose the instructional design, and then 3) invite AI in as a helper.
As if that wasn’t enough work already, there should also be a step zero: agreeing on how much we want to change teaching and learning to accommodate AI. Below, I sketch three visions of how that change could play out at increasing levels of ambition.
Relieve teachers’ burden
An Apr ‘25 survey shows GenAI users save at least 30 min/day because they spend less effort drafting and more on verification. More experienced users can save even more time while delivering better results. Have teachers picked all the low-hanging fruit already?
Personalise learning
38% of teachers in the UK have used AI to create lesson content, model answers, and lesson plans. Are they taking advantage of the speed and ease of AI to generate differentiated sets of materials for each lesson, or using more frequent pop quizzes to surface misconceptions before they fossilise?
Cognitive pivot: verification & integration
A Microsoft study finds that “GenAI shifts the nature of critical thinking toward information verification, response integration, and task stewardship.” In practice that means teaching students a three‑step fact‑check protocol (source, corroborate, reflect) every time an AI draft appears.
Differentiate AI‑assisted vs. independent work
Anthropic’s million‑chat analysis shows student usage patterns ranging from quick answer‑grabs to iterative co‑creation (an echo of Vygotsky’s social‑constructivist “more capable peer”, now in silicon). That calls for a two-tiered assessment framework: test solo competence where it matters, while new rubrics score AI‑assisted work for prompt quality and iterative refinement, not just final prose.
Toward universal Two‑Sigma gains
The Two‑Sigma problem is Bloom’s 1984 finding that 1‑to‑1 tutoring moves the average student two standard deviations above the conventional class mean, but cost made it elusive. A World Bank pilot let Nigerian pupils use GPT‑4 twice a week and delivered roughly two years of learning gains in just six weeks. Technology now offers a plausible path to scale 1‑to‑1 tutoring for all.
Teachers’ new edge — curating the environment
When pacing, practice, and instant feedback are automated, teachers’ irreplaceable value lies in crafting supportive, inquiry‑rich cultures where students negotiate meaning, ethics and identity—spaces no algorithm can reproduce. This is the long‑promised guide‑on‑the‑side role that we have discussed for decades, except now we actually have the tools to make it happen.
Say No to AI for the Sake of AI
Whatever vision makes sense to you, begin with what only you know best: your students’ aspirations, barriers and quirks. Some classes will thrive on multimedia storytelling, others on adaptive practice, others on rigorous debate. Do students need scaffolds, drills, or space to wrestle with ambiguity? These are the questions we need to ask to put pedagogy in the driver’s seat.
AI can already draft, translate, storyboard, and simulate, and soon it will do more we haven’t imagined. In that world, insight into what learners truly need is the scarce resource that only schools and teachers can provide.
Talking to a Tireless Examiner: Speech‑to‑Speech AI Makes Oral Practice Universal
In Short:
Voice chatbots now give every student unlimited, low‑stress speaking practice
One carefully written prompt plus a bank of sample questions is all you need
Teachers can export full transcripts to mine vocabulary range and spot patterns
The same recipe scales to job‑interview or French‑oral coaching too
Oral‑practice lessons have always been a squeeze: one teacher, many voices, and never enough time on the timetable. When a student of mine shared his anxiety about taking the IELTS speaking test, I whipped up a conversational AI to simulate a discussion with the examiner (link to try below). To engage an older, unruly class that usually refuses to speak English, I also made another voice bot that simulates a first date (which made them talk an unprecedented amount), but that’s a story for another time.
Using speech AIs in language lessons is one of the most obvious and high-impact AI-in-education use cases I’ve come across, and teachers can even analyze the transcripts for assessment and to improve teaching.
Building the Chatbot
I used Play.ai’s Agent function to create the chatbot, and it needed just two ingredients:
Exam questions – A bank of IELTS Part‑3 discussion prompts (generated with AI, then spot‑checked).
A system prompt – Used to define the AI’s behaviour, attached below.
Students can just open the website and click to start speaking. You can try it out by clicking this link:
Benefits and Reflection
This AI use case ticks a lot of boxes, for students and teachers alike:
Learners get more practice time and feedback. Because an AI listener feels less intimidating than a teacher or peer, they may also try riskier vocabulary or strategies they wouldn’t normally use in class.
However, I wish there could be more visual cues to also reinforce non-verbal communication and make the practice more authentic, even if it’s just a cartoon avatar.
Teachers get better data about how students are speaking. Each session generates a transcript which can be analyzed to spot filler words, over‑used linkers, or weak verb tenses. There’s a lot more that can be done, like profiling individual growth by comparing first and latest attempts and pulling illustrative quotes for in‑class feedback without relying on memory.
However, I wish the actual conversation audio could also be extracted. I know the technology already exists to analyze pronunciation, so these can be combined to give both content and pronunciation feedback.
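The transcript mining described above can be sketched in a few lines of Python. The filler and linker word lists here are my own illustrative picks rather than a standard taxonomy, and a real implementation would want per-student tracking and tense analysis on top.

```python
import re
from collections import Counter

# Hypothetical word lists -- adapt these to your own students' habits
FILLERS = {"um", "uh", "like", "actually", "basically", "literally"}
LINKERS = {"and", "but", "so", "because", "however", "also"}

def analyse_transcript(transcript: str) -> dict:
    """Summarise filler-word use, linker use, and vocabulary range."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(words)
    return {
        "total_words": len(words),
        "fillers": {w: counts[w] for w in FILLERS if counts[w]},
        "linkers": {w: counts[w] for w in LINKERS if counts[w]},
        # Type-token ratio: a rough proxy for vocabulary range
        "vocab_range": len(set(words)) / max(len(words), 1),
    }

sample = ("Um, so I think, like, travelling is actually good because, "
          "um, you learn new things and, like, meet people.")
report = analyse_transcript(sample)
print(report["fillers"])  # e.g. {'um': 2, 'like': 2, 'actually': 1}
```

Running the same report on a student's first and latest transcripts gives a crude but concrete growth profile; pronunciation feedback would need the audio itself, which, as noted, isn't exported.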
If chatting with the AI can be assigned as homework, valuable live lessons can be repurposed. I’d like to see the bot handle Q&A while classroom time shifts to strategy: turn‑taking, discourse markers, and peer critique—areas where human nuance still rules.
Where this leads
Swap the question bank and the system prompt, and the same setup can power mock job interviews, scholarship panels, or oral defences. As next‑gen speech models become more authentic and gain the ability to understand tone and emotion, the experience can only get better.
That said, I do worry about emotional dependence, and at some point we may decide that full authenticity isn’t needed anyway. VR education research similarly shows no benefit from highly realistic visuals: cognitive realism and abstracted environments help learners focus on essential information, improving comprehension and retention.
Schools and Teachers Stepping Up Their AI Professional Development
We're delighted to see more schools exploring AI's impact with their teachers through various approaches. Whether it's establishing an AI core group for SLT discussions, supporting teachers through online CPD courses, or dedicating workshops during INSET Days - all these initiatives represent meaningful first steps in bringing AI into educational conversations. The Good Future Foundation is here to provide tailored support that suits each school's unique needs.
Regional AI Professional Development Days
Understanding that London-centered training can be challenging for many teachers, we're bringing professional development directly to different regions. We're grateful to our host schools for opening their doors to educators in their local areas.
Upcoming dates:
All events are FREE to attend. Want to see what it’s like? Watch our video from a recent event in Stoke-on-Trent to get a better idea.
AI Workshop for School INSET Day
Trust and school INSET Days provide excellent opportunities to engage entire staff teams in school priorities. This spring, we're working with Wymondham College and Amadeus Primary Academies during their INSET Days to help initiate AI conversations and explore its impact on education and students' futures. Several trusts have already scheduled sessions for the next academic year. As a non-profit, we deliver these sessions free of charge. If you're interested, please contact us early to secure your date.
Online CPD Drops
Has your school not yet explored AI, but you as a teacher want to learn more? We're partnering with STEM Learning to make AI CPD accessible to all teachers through online CPD drops this summer.
Following our Listen First approach, we've conducted focus groups with primary and secondary teachers to understand successes, challenges, and support needs. We want to hear from more teachers: What gaps exist? What additional support would help? What would boost your confidence in using AI in your classroom?
Please take 5 minutes to share your thoughts with us. Your input will directly shape the CPD, resources, and support we develop through this partnership.
Join Us at the Festival of Education
Mark your calendars for 3-4 July! We’re taking part in the Festival of Education at Wellington College with an interactive AI Maker Space designed for hands-on exploration. Step into our booth to test AI tools and discuss how responsible AI use might enhance teaching, tailored to your specific subject area and student needs.
In a special Festival highlight, we’ll be launching our new educator community platform, a digital space where teachers can exchange implementation stories, find inspiration, and connect with peers navigating similar AI journeys. The platform will feature curated resource collections and dedicated discussion spaces for various education levels and disciplines.
Get a Festival ticket for Free!
Early adopters, here's your chance to attend the Festival for free! We're giving away complimentary Festival of Education tickets to educators who contribute AI teaching resources to our community platform before 31 May. Whether you've created lesson plans, assessment tools, or classroom activities utilising AI, sharing your trials and tribulations could earn you entry to this educational event.
Simply register through the link below, upload your resources to our beta platform, and you could be joining us at Wellington College this summer! All contributions will be shared with our growing community of educators, expanding your impact beyond your classroom. Don't miss this opportunity! Submit your resources today and we'll send the Festival ticket your way!
Research on AI and disinformation and misinformation
We are currently conducting research examining how AI technologies are changing the landscape of disinformation and misinformation. Our study specifically investigates the implications for educational communities and how these technological shifts are affecting critical thinking development and information literacy among students and teachers alike.
Through this research, we aim to equip educators to understand both the mechanisms behind AI-powered false information and the evolving strategies teachers need to equip students with digital discernment skills.
If you are interested in and passionate about this subject and would like to contribute to this research, please contact us to learn more about involvement opportunities and how you can play a role in this essential work.
Foundational Impact Podcast: Sharing from Computing at School and Belgrave St. Bartholomew’s Academy
Two recent episodes feature Becci Peters and Ben Davies of Computing at School (CAS), who discuss the challenges they have observed with AI implementation and digital divides through their work supporting computing teachers. In another episode, we hear from the leadership team at Belgrave St Bartholomew's Academy, who kindly share their journey of meaningful technology integration and how they’re working to engage parents in supporting their digital strategy.

