Guidance and support for the use of AI in Cambridge Technicals in Digital Media
02 February 2026
John Hibbert, Media Subject Advisor

The rapid and ongoing advances in generative artificial intelligence (AI) tools bring both benefits and challenges to education and assessment.
In this blog, we highlight the guidance available for managing AI use in Cambridge Technicals in Digital Media and how to deal with misuse in assessments.
What is AI?
AI tools generate text or images in response to user prompts and questions. The responses of AI tools are based upon the data sets that they have been trained on. ChatGPT is the best-known example of an AI chatbot, but many other chatbots and tools are available.
AI use in digital media
Some uses of AI are already well established within media industries, for example in personalised content recommendation systems. Generative AI is becoming increasingly widespread across media sectors: it is used to write news stories and throughout a wide range of production processes, such as creating visual effects for film and television. AI tools are now widely integrated into media production software such as Photoshop, and AI is becoming a key part of production and post-production in sectors including publishing, film and video production, animation and video games.
AI tools
AI platforms such as ChatGPT, Microsoft Copilot and Google Gemini can all help students explore relevant concepts and content for digital media, but they carry risks of misuse in work submitted for the internally assessed units.
Platforms focused specifically on the creation of images or video such as Midjourney, DALL-E, Stable Diffusion and Deforum could also be misused by students when creating production work.
Many existing programs and platforms that students use for their production work, including Canva and Adobe Creative Cloud apps like Photoshop, Premiere Pro and Firefly, incorporate AI tools.
Appropriate use of AI
Whether the use of AI by students is appropriate for a given task will depend on the marking criteria and nature of the task.
Use of AI for examined units
Most appropriate uses of AI for Cambridge Technicals in Digital Media will relate to preparing students for externally assessed units and supporting general teaching and learning. These might include:
- to support homework
- flipped learning activities
- note-taking
- research
- creating glossaries of key terms.
Students could use AI to research the purpose and format of different pre-production documents for Unit 2, or to provide explanations of key concepts such as connotation or theoretical ideas for Unit 1. It’s worth bearing in mind that information provided by AI can include errors, so it should be checked before being shared with your students.
Use of AI in internally assessed units
For internally assessed units, AI could be used as a research tool. For example, students could use AI as a research source to explore the conventions of the media product they are going to create in Unit 3.
AI tools are included in a range of software and platforms, such as Photoshop, which are used in production work. Students can use these tools to edit their own original content as long as they can still independently demonstrate that they have met the requirements of the grading criteria.
If students do use AI when producing work for assessment, they must acknowledge its use and clearly indicate how they have used it.
Inappropriate use of AI
As with plagiarism, students can use AI to create work which they then try to pass off as their own. Where a student has used AI to complete all or some of their work, they are not demonstrating their own knowledge, understanding and application of skills. This may prevent the candidate from presenting their own authentic evidence.
Examples of AI misuse include:
- using or modifying AI responses without acknowledgement
- disguising the use of AI
- using it for substantial sections of work.
You can support your students by:
- teaching them about appropriate use of AI in Digital Media
- demonstrating how to reference AI correctly where its use is appropriate
- having clear policies for AI use within your department.
Inappropriate uses of AI in internally assessed units could include:
- using it to generate written content, for example replicating content provided by AI to describe the work of professional photographers in LO1 of Unit 8
- using AI to create required planning and pre-production documents
- using AI-generated content in media products in place of original content
- submitting media products created by AI.
Use of AI in production work
Students should avoid any use of AI tools in production work that would prevent them from showing they have independently met the required criteria.
AI-generated content used in production work, such as images, text or footage, does not constitute original material. Where the grading criteria require original content, AI-generated content cannot be credited when work is assessed.
If AI content is used where content does not need to be original, and the use of AI is appropriately referenced in line with the JCQ guidance, this would be acceptable.
What to do when candidates misuse AI in assessments
Teachers must not accept work which is not the student’s own. Ultimately the Head of Centre is responsible for ensuring that students do not submit inauthentic work.
If you suspect AI misuse before the student has signed the declaration of authentication, your centre doesn’t need to report the malpractice to Cambridge OCR. You can resolve the matter before signing the declarations.
If you suspect AI misuse within candidate work after formal submission and signing of the authentication sheet, you must report it. Report concerns with a JCQ M1 form: see the JCQ AI guidance, available on the Malpractice section of the JCQ website. Please email your completed forms to OCR at malpractice@ocr.org.uk.
Further support
Please refer to the JCQ AI use in assessments: Protecting the integrity of assessment document for further information on managing the use of AI within your assessments.
We also have a range of support resources, including recorded webinars, on our AI support page.
Stay connected
If you have any questions, you can email us at media@ocr.org.uk or call us on 01223 553998. You can also sign up for subject updates for the latest news, resources and support.
Thinking of teaching any of our qualifications? Use our online form to let us know, so that we can help you with more information.
About the author
John Hibbert has been Subject Advisor for Media and Film Studies since 2018. Before joining Cambridge OCR John taught a range of media and film studies qualifications in secondary schools, and was a head of department for the last eight years. Predictably, in his spare time he is a keen filmgoer, and in addition enjoys reading and miserable indie music.