Guidance and support for the use of AI in health and social care and child development
11 February 2026
This blog was originally published in 2024 and has now been updated with new information.
Sarah Ash and Sarah Millington, Health and Social Care and Child Development Subject Advisors


The rapid and ongoing advances in generative artificial intelligence (AI) tools bring both benefits and challenges to education and assessment.
In this blog, we highlight the guidance available for managing AI use in health and social care and child development and how to deal with misuse in assessments.
What is AI?
AI tools generate text or images in response to user prompts and questions. The responses of AI tools are based upon the data sets that they have been trained on. ChatGPT and Microsoft Copilot are the best-known examples of AI chatbots, but many other chatbots and tools are available.
AI is relevant to both health and social care and child development. Because it can simulate human-like responses, it is already being used to train staff working within social care and to help tackle workforce shortages.
Within the NHS, AI is being looked at to help with:
- faster and more accurate diagnosis
- reducing errors caused by human fatigue
- assisting with automated repetitive tasks
- reducing costs
- reducing overall mortality rates.
The AI roadmap provides an overview of current use of AI.
The NHS England Transformation Directorate has produced a national strategy for AI in health and social care to show how AI can be implemented and what it could look like.
Appropriate use of AI
Whether the use of AI by students is appropriate for a given task will depend on the marking criteria and nature of the task.
It may be useful as a starting point for ideas and discussion that support the delivery and understanding of content in both health and social care and child development. For example:
- writing exam questions for exam practice
- generating an answer to a question and reviewing the response
- revision.
AI should not be used where the content generated forms part of the criteria for non-examined assessed units (or moderated units). For example, if the criteria asked for a plan for an interview, students shouldn’t generate this using AI.
It is important that any use of AI to help with initial ideas or research should be referenced. Students need to be made aware of this from the outset.
Inappropriate use of AI
Like any form of plagiarism, AI misuse occurs when students use AI to create work which they then try to pass off as their own. Where a student has used AI to complete all or some of their work, they are not demonstrating their own knowledge, understanding and application of skills. This may prevent the candidate from presenting their own authentic evidence.
Examples of AI misuse include:
- using or modifying AI responses without acknowledgement
- disguising the use of AI
- using it for substantial sections of work.
You can support your students by:
- teaching them about appropriate use of AI in health and social care and child development
- demonstrating how to reference AI correctly where its use is appropriate
- having clear policies for AI use within your department.
If the student has used AI to provide evidence directly assessed in the marking criteria, this cannot be awarded marks. This will also prevent the student from providing their own authentic evidence towards the marking criteria.
Some examples where AI should not be used might be:
- to write an answer to part of or a whole task
- to ask for a response to a specific criterion or criteria, for example to design a room layout for an early years setting, a creative activity, a support plan or a health campaign
- to create a presentation for an assessment
- using AI bots to be a participant in an activity (e.g. communication)
- using AI to create summary notes of research for use in an assignment or exam, such as the Cambridge Technicals Level 3 Unit 25 pre-release.
Teachers must be aware of a candidate’s use of AI and mark accordingly. If you are unsure whether a student’s use of AI is inappropriate, please contact us. You can share with your students the JCQ guidance on use of AI in non-examined assessment.
What to do when candidates misuse AI in assessments
Through supervision of candidates, you should be able to minimise the use of AI. You must not accept work which is not the student’s own. Ultimately, the Head of Centre is responsible for ensuring that students do not submit inauthentic work.
If you suspect AI misuse before the student has signed the declaration of authentication, your centre doesn’t need to report the malpractice to Cambridge OCR. You can resolve the matter prior to the signing of the declarations.
If you suspect AI misuse within candidate work after formal submission and signing of the authentication sheet, you must report it. Report concerns with a JCQ M1 form, as outlined in the JCQ AI guidance, available on the Malpractice section of the JCQ website. Please email your completed forms to Cambridge OCR at malpractice@ocr.org.uk.
Frequently asked questions
Our frequently asked questions for both Health and Social Care and Child Development highlight key queries that have been asked by centres.
Further support
Please refer to the JCQ AI use in assessments: Protecting the integrity of assessment document for further information on managing the use of AI within your assessments.
We also have a range of support resources, including recorded webinars, on our AI support page.
Stay connected
If you have any questions, you can email us at OCRHealthandSocialCare@ocr.org.uk or call us on 01223 553998. You can also sign up for subject updates to keep up to date with the latest news, updates and resources.
About the authors
Sarah Ash joined Cambridge OCR as a subject advisor in 2018. During her time with us she has supported centres with their queries, attended network meetings and contributed to the production of a number of resources. She has been involved in the redevelopment of Cambridge Nationals in Health and Social Care and Child Development and the development of Cambridge Advanced Nationals. Before joining Cambridge OCR Sarah was a teacher of Health and Social Care and a moderator.
Sarah Millington joined Cambridge OCR after teaching Health and Social Care and Child Development for 16 years. Having been a teacher, subject lead and moderator during her career, she has planned and developed subjects to meet the needs of her students and help them become independent learners, focusing on effective teaching and learning skills. She has experienced and survived several qualification changes: GCSEs to Cambridge Nationals, and A Levels to Cambridge Technicals. In her spare time she enjoys open water sea swimming, travelling and cooking. Pie and cake are key favourites.
Related blogs