
New rules for a safer generation of children online

Source: Ofcom. Published on this site on Monday 28 April 2025 by Jill Powell

  • Ofcom finalises child safety measures for sites and apps to introduce from July
  • Tech firms must act to prevent children from seeing harmful content
  • Changes will mean safer social feeds, strong age checks and more help and control for children online

Children in the UK will have safer online lives under transformational new protections finalised by Ofcom.

Ofcom is laying down more than 40 practical measures for tech firms to meet their duties under the Online Safety Act. These will apply to sites and apps used by UK children in areas such as social media, search and gaming. This follows consultation and research involving tens of thousands of children, parents, companies and experts.

The steps include preventing minors from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. Online services must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.

Dame Melanie Dawes, Ofcom Chief Executive, said:

 “These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”

Gen Safer: what we expect

In designing the Codes of Practice finalised on 24 April 2025, Ofcom researchers heard from over 27,000 children and 13,000 parents, alongside consultation feedback from industry, civil society, charities and child safety experts. Ofcom also conducted a series of consultation workshops and interviews with children from across the UK to hear their views on the proposals in a safe environment.

Taking these views into account, Ofcom’s Codes demand a ‘safety-first’ approach to how tech firms design and operate their services in the UK. The measures include:

  • Safer feeds. Personalised recommendations are children’s main pathway to encountering harmful content online. Any provider whose service operates a recommender system and poses a medium or high risk of harmful content must configure its algorithms to filter harmful content out of children’s feeds.
  • Effective age checks. The riskiest services must use highly effective age assurance to identify which users are children, so that they can protect those children from harmful material while preserving adults’ rights to access legal content. That may involve preventing children from accessing the entire site or app, or only some parts or kinds of content. Services that have minimum age requirements but do not use strong age checks must assume younger children are on their service and ensure they have an age-appropriate experience.
  • Fast action. All sites and apps must have processes in place to review, assess and quickly tackle harmful content when they become aware of it.
  • More choice and support for children. Sites and apps are required to give children more control over their online experience. This includes allowing them to indicate what content they don’t like, to accept or decline group chat invitations, to block and mute accounts and to disable comments on their own posts. There must be supportive information for children who may have encountered, or searched for, harmful content.
  • Easier reporting and complaints. Children will find it straightforward to report content or complain, and providers should respond with appropriate action. Terms of service must be clear so children can understand them.
  • Strong governance. All services must have a named person accountable for children’s safety, and a senior body should annually review the management of risk to children.

These measures build on the rules Ofcom has already put in place to protect users, including children, from illegal online harms such as grooming. They also complement specific requirements for porn sites to prevent children from encountering online pornography.

What happens next

Providers of services likely to be accessed by UK children now have until 24 July 2025 to finalise and record their assessment of the risk their service poses to children, which Ofcom may request. From 25 July 2025, they should apply the safety measures set out in Ofcom’s Codes to mitigate those risks.

If companies fail to comply with their new duties, Ofcom has the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.

These Codes of Practice are the basis for a new era of online child safety regulation. Ofcom will build on them in the coming months with further consultations on additional measures to protect users from illegal material and from harms to children.