Source: Internet Watch Foundation (IWF), published Monday 30 June 2025 by Jill Powell
A new guide from the Internet Watch Foundation (IWF) and National Crime Agency (NCA) warning about a rise in the abuse of AI to create nude and sexual imagery of children has been issued to professionals working with children and young people to help them address the evolving threat.
The guide will equip education practitioners and all those who support children and young people in the UK, such as social workers, youth football clubs, Scout and Guide groups and summer holiday clubs, with essential clarity and information they need to appropriately respond to incidents involving AI-generated child sexual abuse material.
The new guide makes it clear AI child sexual abuse imagery “should be treated with the same level of care, urgency and safeguarding response as any other incident involving child sexual abuse material” and aims to dispel any misconception that AI imagery causes less harm than real photographs or videos.
Alex Murray, National Crime Agency Director of Threat Leadership and policing lead for artificial intelligence, said:
“AI-generated child sexual abuse material is a threat, with research from the Internet Watch Foundation showing an increase in reporting.
“Generative AI image creation tools will increase the volume of child sexual abuse material available online, creating difficulties with identifying and safeguarding victims due to vast improvements in how real the images appear.
“Our survey showed that more than a quarter of respondents were not aware that AI-generated CSAM is illegal. A majority of professionals felt that guidance was needed to help them deal with this threat, which is why we’ve worked closely with the IWF to produce this resource. It will help professionals better understand AI, how it’s used to create CSAM and ways of responding to an incident involving children or young people.
“Protecting every single child from harm should matter to everyone. This is why we continue to work closely with partners to tackle this threat and are investing in technology to assist us with CSA investigations to safeguard children.
“Tackling child sexual abuse is a priority for the NCA and our policing partners, and we will continue to investigate and prosecute individuals who produce, possess, share or search for CSAM, including AI-generated CSAM.”
The document, Child sexual abuse imagery generated by artificial intelligence: An essential guide for professionals who work with children and young people, was issued to a network of 38,000 professionals and partners working with children in the UK to raise awareness of the issue and provide information and guidance.
The guide also provides professionals with a step-by-step response for dealing with incidents relating to AI-generated child sexual abuse imagery, such as how best to handle any illegal material and how to ensure that victims receive the appropriate support they need.
Tailored versions of the guidance have been created for England, Scotland, Wales and Northern Ireland and will be distributed across the UK, as well as hosted on the IWF and the NCA’s CEOP websites.