
Online age checks must be in force from Friday 25 July 2025

Source: Ofcom. Published on this site Monday 28 July 2025

The changes mean that risky sites and apps – large and small – must use highly effective ‘age gating’ methods to identify which users are children, and then prevent them from accessing pornography, as well as content relating to self-harm, suicide and eating disorders.

Services which allow porn

Ahead of the 25 July deadline, change was already happening. Over the last month, the UK’s biggest and most popular adult service providers – including Pornhub – plus thousands of smaller sites have committed to deploying age checks across their services. This means it will be harder for children in the UK to access online porn than in any other OECD country.

Other online platforms have now announced they will deploy age assurance – including Bluesky, Discord, Grindr, Reddit and X.[1]

Ofcom is ready to take enforcement action against any company that allows pornographic content and does not comply with the age-check requirements by the deadline. It is now extending its existing age assurance enforcement programme – previously focused on studio porn services – to cover all platforms that allow users to share pornographic material, whether they are dedicated adult sites or other services that include pornography.

Ofcom will be actively checking compliance from 25 July and, should it prove necessary, expects to launch investigations into individual services the following week. Any new cases would add to the 11 investigations Ofcom already has in progress.

Age checks to shield children from other harms

Under the rules, sites that allow other forms of harmful content must also have highly effective age checks from 25 July.

Ofcom is launching a new age assurance enforcement programme, building on work undertaken by its ‘small but risky’ taskforce, to monitor the response from industry. This will specifically target sites dedicated to the dissemination of harmful content, including material promoting self-harm and suicide, eating disorders, or extreme violence and gore.

Protecting children on the most popular sites and apps

As well as preventing children from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography, Ofcom’s Codes also demand that online services act to protect children from dangerous stunts or challenges, misogynistic, violent, hateful or abusive material, and online bullying.

Even where sites and apps do not technically allow these types of harmful material under their terms of service, Ofcom’s research shows that such content can be all too prevalent. In particular, content recommended in personalised ‘for you’ feeds represents children’s main pathway to encountering these harms. The Codes are clear that, among other things, algorithms must be tamed and configured for children so that the most harmful material is blocked from their feeds.

To hold sites and apps to account, Ofcom is today launching an extensive monitoring and impact programme, primarily focused on the biggest platforms where children spend the most time – including Facebook, Instagram, Roblox, Snap, TikTok and YouTube. This will include:

  • a comprehensive review of these platforms’ efforts to assess risks to children, which must be submitted to Ofcom by 7 August at the latest. Ofcom will report on its analysis of these assessments later this year;
  • scrutinising these platforms’ practical actions to keep children safe – details of which must be disclosed to Ofcom by 30 September. In particular, Ofcom will interrogate: whether they have effective means of knowing who their child users are; how their content moderation tools identify types of content harmful to children; how effectively they have configured their algorithms so that the most harmful material is blocked in children’s feeds; and how they have prevented children from being contacted by adult strangers;
  • tracking children’s online experiences to judge whether safety is improving in practice – through Ofcom’s ongoing programme of children’s research and by consulting with children through new work with the Children’s Commissioner for England; and
  • swift enforcement action if evidence suggests that platforms are failing to comply with their child safety duties.

This activity is in addition to the ongoing action to enforce Ofcom’s illegal harms Codes, including measures to protect children from sexual abuse and exploitation online.

Majority of UK parents supportive of children’s safety measures

New research suggests that a majority of parents believe that the measures set out in Ofcom’s Protection of Children Codes will improve the safety of children in the UK.

Over seven in ten parents (71%) feel that the measures overall will make a positive difference to children’s safety online, while over three-quarters (77%) are optimistic that age checks specifically will keep children safer.

Nine in 10 parents (90%) agree that it is important for tech firms to follow Ofcom’s rules, but a significant minority (41%) are sceptical about whether tech firms will comply in practice.

Dame Melanie Dawes, Ofcom’s Chief Executive, said: “Prioritising clicks and engagement over children’s online safety will no longer be tolerated in the UK. Our message to tech firms is clear – comply with age-checks and other protection measures set out in our Codes, or face the consequences of enforcement action from Ofcom.”