Every decision we take is based on research evidence and on feedback from both teachers and students.
September 2022
We created a small sample of digital questions and sought feedback through surveys and teacher focus groups across the country.
We engaged with over 450 teachers.
Most teachers were positive about digital exams and the direction of travel. We took all feedback into account and combined it with our own internal research.
September 2023
We tested our updated question designs and carried out further testing of the systems. We also explored how to build an IDE within our platform that would give students and teachers an authentic programming experience.
We built the initial IDE to support Python.
2024
In early 2024 we shared a concept version of our IDE for initial testing and feedback. It received a positive response, and we adapted the IDE based on that initial teacher feedback.
Later that year, we ran further testing of the IDE, this time 'live' with students in a school environment. Students responded very positively, and we used their feedback to make further improvements.
We also carried out a larger test of a full paper 01. This used a blend of more traditional paper-based question styles (e.g. text-box answers and tick boxes) and newer item types, such as gap-matching and drag-and-connect answers.
At the end of the year we tested our approaches for paper 02 and received invaluable feedback, which we incorporated into the next iteration of the paper.
2025
We are currently testing a full paper 02 and gathering feedback on the accessibility features of our platform.