
SAFE Newsfeed

Child sexual abuse material vs ‘child porn’: why language matters

Source: Internet Watch Foundation (IWF) published on this website Friday 4 July 2025 by Jill Powell

In 2024, the Internet Watch Foundation confirmed 291,273 reports of child sexual abuse material (CSAM) online, the highest number ever recorded. That’s nearly 800 reports a day, each containing content that shows the sexual abuse of a child.

Yet the phrase often still used to describe this content is ‘child pornography.’

This article will break down what CSAM is, what makes it different, and why language matters in the fight to protect children. We’ll also explore the legalities behind CSAM, the growing role of technology, what’s being done to stop this abuse, and how you can be part of the solution.

What is ‘child porn’?

There is no such thing as ‘child pornography’. This phrasing implies consent and even legality, as pornography is legal in most countries for consenting adults. But children cannot legally or ethically consent to sexual activity, and they can never be complicit in their abuse.

That’s why the Internet Watch Foundation (IWF) is calling for an end to the outdated and damaging use of the phrase ‘child pornography’. It’s time to use language that reflects the true nature of these acts and respects the victims of these crimes.

While the term is still used in some legislation and media, it’s not the right language. What people often call ‘child porn’ is more accurately known as child sexual abuse material (CSAM).

What is CSAM (child sexual abuse material)?

CSAM includes any images or videos that show the sexual abuse or exploitation of a child.

CSAM takes many forms. Sometimes it’s the result of grooming, where someone builds trust with a child online and then manipulates them into sharing explicit images. In other cases, it involves sexually coerced extortion (sometimes called ‘sextortion’), which is when a child is blackmailed into sending more imagery or money under threat.

And now, with the rise of technology, some CSAM is AI-generated but disturbingly realistic. Even if no real child was directly involved in the sexual abuse, these images still feed demand and normalise abuse, especially when the AI models have been trained on images of real children.

In most countries, creating, sharing or viewing child sexual abuse material (CSAM) is a serious criminal offence. But beyond the law, it’s about protecting children and treating them with the care and respect they deserve. Using the term CSAM helps us focus on what matters: stopping abuse and standing up for children’s safety and dignity.

CSAM vs ‘child porn’: why language matters

The words we use matter. When people say ‘child porn,’ it can sound almost like a category of adult content, but it’s evidence of a child being abused. 

This has real-world consequences. During Operation Ore, a major UK police investigation into people accessing child abuse images, media reports often incorrectly used the phrase ‘child porn.’ The result was sensationalism, and in some cases, even less public empathy for victims. It blurred the reality of what those images represented. In some historical cases, courts have handed down lighter sentences because the material was framed as pornography rather than what it truly is: abuse.

That’s why we use the term CSAM, or child sexual abuse material, because children cannot consent to their own abuse. By avoiding the phrase ‘child porn’ and using clear, accurate language like CSAM, we put responsibility where it belongs: on the offender.

Is CSAM illegal?

Yes, CSAM is illegal. Across the globe, it is criminalised under international agreements like the UN Convention on the Rights of the Child. Laws vary slightly, but the core message is the same: children must be protected from exploitation, and CSAM is a direct violation of their rights and safety.

Every image or video showing CSAM documents a moment of exploitation. It’s not just illegal because it’s disturbing or offensive; it’s illegal because it records a crime. That material often continues to circulate online for years, repeatedly victimising the person every time it’s viewed, shared, sold or downloaded. This ongoing harm is part of why most countries take such a strong stance against it.

Unfortunately, not every country enforces these laws equally. Some developing nations have weaker legal frameworks, limited law enforcement capacity or outdated definitions that don’t fully cover online content. In certain places, possession might not be explicitly criminalised, or enforcement may be inconsistent, making it harder to prosecute offenders or protect victims effectively.

That said, there is a growing global effort to strengthen laws and improve cooperation between countries.

What is being done to stop CSAM?

Thankfully, powerful work is being done to stop the spread of CSAM. At the Internet Watch Foundation (IWF), we are one of the key organisations leading this effort.

Our latest report highlights a sharp rise in material in which children have been groomed or coerced online. To combat this and other types of CSAM, the IWF uses advanced tools like Image Intercept technology to identify and block known abuse material before it can even be uploaded.

Organisations like INTERPOL, ECPAT and INHOPE also work with governments worldwide to detect, report and remove CSAM and bring offenders to justice.

Want to be part of the solution? Whether you’re an individual or a company, you can help fund vital work and raise awareness. Visit the IWF’s Support Us page to get involved and, if you work for an organisation that handles online images and videos, consider becoming a Member.

What you can do to protect children and help survivors

If a child is in immediate danger, call 999 or the police emergency phone number in your country.

Understanding that child sexual abuse material (CSAM) is evidence of real abuse is a critical first step. Once you know that, what else can you do to help?

If someone, child or adult, has experienced child sexual abuse, it’s vital to respond with compassion, urgency and belief. Learn to spot the signs of sexual abuse: sudden changes in behaviour, withdrawal, anxiety or physical signs that don’t seem right. Trust your instincts and speak up.

Who to contact:

  • IWF: If you’ve encountered online material that involves child sexual abuse, you can report it anonymously via the IWF website.
  • NSPCC: If you’re worried about a child, the NSPCC is here to listen. You can call their helpline 24/7 at 0808 800 5000 or visit their website for advice and support.
  • Marie Collins Foundation: The Marie Collins Foundation offers support to anyone harmed by technology-assisted child sexual abuse which includes online child sexual abuse.

Photographer jailed after sexually assaulting two models

Source: Attorney General's Office published on this site Thursday 3 July 2025 by Jill Powell

Wayne Glover-Stuart, 36, from Chiswick, West London, has had his suspended sentence overturned and has been jailed for three years after the Solicitor General Lucy Rigby KC MP referred his case to the Court of Appeal.

The court heard that Glover-Stuart, a former theatre producer, invited two men on separate occasions to an underwear modelling photoshoot.  

During both incidents, Glover-Stuart touched the victims’ genitals before carrying out sexual assaults.  

The Solicitor General Lucy Rigby KC MP said:  

Glover-Stuart’s crimes were appalling. He lured his victims into a vulnerable position, abusing their trust before sexually assaulting them for his own gratification.

I welcome the Court of Appeal’s decision to increase this offender’s sentence following my intervention.

Wayne Glover-Stuart was sentenced to two years’ imprisonment, suspended for two years, for sexual assault and causing a person to engage in sexual activity without consent, on 16 April 2025 at the Inner London Crown Court. 

On 1 July 2025, Glover-Stuart’s suspended sentence was quashed and he was jailed for three years after the case was referred to the Court of Appeal under the Unduly Lenient Sentence scheme.

Amnesty launched as part of mission to halve knife crime

Source: Home Office published on this website Tuesday 1 July 2025 by Jill Powell

With the support of Word 4 Weapons and FazAmnesty, young people will be able to anonymously hand in any weapons to surrender bins or a purpose-built and fully secure van across London, Greater Manchester and the West Midlands – the 3 areas with the highest levels of knife crime in England.

Part of the government’s most ambitious surrender scheme yet and its Plan for Change, the 37 new amnesty bins and the mobile surrender van will be strategically located in these high-risk areas throughout July, in partnership with local councils, to provide young people with an accessible alternative way to hand in weapons without needing to go to a police station.

Throughout the month the government’s Coalition to Tackle Knife Crime and other grassroots organisations will be using their platform as trusted voices in communities to encourage young people to hand in their weapons via these routes, while signposting them to local support services.

From 1 August 2025, deadly ninja swords will be banned in full – illegal to possess in public or private – and so, in addition to the surrender arrangements across the 3 hotspot areas, people will also be able to hand in ninja swords to designated police stations across the country.  

Policing Minister Dame Diana Johnson said: 

The launch of today’s scheme is a result of months of collaborative working with the Coalition to Tackle Knife Crime and I’m optimistic about what we can achieve together over the next month and then the years to come as part of our Plan for Change. 

I am incredibly grateful to Pooja Kanda, Sandra Campbell and Faron Paul whose work to tackle knife crime is making a real difference to young people’s lives.

This scheme is just one part of addressing knife crime. We will not stop listening to those who are directly working with those impacted by this crime.

The scheme has been designed to provide people with a range of ways to hand in weapons outside of police stations. Word 4 Weapons and FazAmnesty, both members of the government’s Coalition to Tackle Knife Crime, have a proven track record in supporting young people to surrender dangerous weapons and directing them towards local support.  

The ninja sword surrender and compensation scheme will also be running in tandem throughout July in police stations across England and Wales. The ban on ninja swords is a result of the tireless campaigning of the Kanda family, who tragically lost their son Ronan in 2022 when he was killed with one of these deadly weapons. The ban on ninja swords is part of Ronan’s Law which was introduced to Parliament this year and includes measures to stop the illegal sale of knives online. Ronan’s Law will be included in the Crime and Policing Bill.  

Members of the public wishing to surrender a ninja sword in exchange for compensation should take it to their local police station. Ninja swords can also be surrendered in any available surrender bin; however, this will not result in any compensation. Full details about how to claim compensation for ninja swords can be found on GOV.UK or via local police.

Ofsted publishes research on vulnerability commissioned from the National Children’s Bureau

Source: Ofsted published on this website Wednesday 2 July 2025 by Jill Powell

Ofsted has today published a research report it commissioned from Research in Practice at the National Children’s Bureau (NCB). Research in Practice undertook an evidence review to explore key messages from policy and research before holding 2 phases of stakeholder focus groups. Over 400 participants took part, including professionals from all the sectors Ofsted inspects, young people, parents and carers, and Ofsted staff.

Read the full report: From trait to state: how Ofsted might consider conceptualising vulnerability for inspection and regulation.

Sir Martyn Oliver, Ofsted’s Chief Inspector, said:

I am grateful to Research in Practice at the National Children’s Bureau for carrying out this research for us. It provides useful insight as we continue to develop our work related to vulnerability in children and learners.

I am committed to putting disadvantaged and vulnerable children at the heart of everything we do.

Dez Holmes, Director of Research in Practice, said:

We are hugely grateful to over 500 colleagues across the country for contributing to this fascinating project. The rich expertise of early years, education, further education and social care professionals has been invaluable in helping us at Research in Practice to think through what vulnerability means for children, young people and families. 

We appreciate the opportunity Ofsted provided. It is rewarding to do work that is explicitly conceptual, whilst potentially being able to influence practice and policy.  

The work colleagues do across the education and social care sector is as vital as it is complex. Vulnerabilities are varied and affect everyone in different ways. We are delighted to have been able to support critical thinking and reflection.

Essential guidance on AI-generated child sexual abuse material launched Friday 27 June 2025

Source: Internet Watch Foundation (IWF) published on this website Monday 30 June 2025 by Jill Powell

A new guide from the Internet Watch Foundation (IWF) and National Crime Agency (NCA) warning about a rise in the abuse of AI to create nude and sexual imagery of children has been issued to professionals working with children and young people to help them address the evolving threat.  
 
The guide will equip education practitioners and all those who support children and young people in the UK, such as social workers, youth football clubs, Scout and Guide groups and summer holiday clubs, with essential clarity and information they need to appropriately respond to incidents involving AI-generated child sexual abuse material. 

The new guide makes it clear AI child sexual abuse imagery “should be treated with the same level of care, urgency and safeguarding response as any other incident involving child sexual abuse material” and aims to dispel any misconception that AI imagery causes less harm than real photographs or videos.

Alex Murray, National Crime Agency Director of Threat Leadership and policing lead for artificial intelligence, said:

 “AI-generated child sexual abuse material is a threat, with research from the Internet Watch Foundation showing an increase in reporting.

“Generative AI image creation tools will increase the volume of child sexual abuse material available online, creating difficulties with identifying and safeguarding victims due to the vast improvements in how real the images appear.

“Our survey showed that more than a quarter of respondents were not aware that AI-generated CSAM is illegal. A majority of professionals felt that guidance was needed to help them deal with this threat, which is why we’ve worked closely with the IWF to produce this resource. It will help professionals better understand AI, how it’s used to create CSAM and ways of responding to an incident involving children or young people.

 “Protecting every single child from harm should matter to everyone. This is why we continue to work closely with partners to tackle this threat and are investing in technology to assist us with CSA investigations to safeguard children.

“Tackling child sexual abuse is a priority for the NCA and our policing partners, and we will continue to investigate and prosecute individuals who produce, possess, share or search for CSAM, including AI-generated CSAM.”

The document, Child sexual abuse imagery generated by artificial intelligence: An essential guide for professionals who work with children and young people, was issued to a network of 38,000 professionals and partners working with children in the UK to raise awareness of the issue and provide information and guidance.

The guide also provides professionals with a step-by-step response for dealing with incidents relating to AI-generated child sexual imagery, such as how best to handle any illegal material and ensuring that victims are given the appropriate support they need.
 
Tailored versions of the guidance have been created for England, Scotland, Wales and Northern Ireland and will be distributed across the UK, as well as hosted on the IWF and the NCA’s CEOP websites.