
Keeping children safe online: changes to the Online Safety Act explained (updated 1 August 2025)

Source: Department for Science, Innovation and Technology. Published on this site Monday 4 August 2025 by Jill Powell.

The way children experience the internet has fundamentally changed, as new laws under the Online Safety Act have come into force to protect under-18s from harmful online content they shouldn’t ever be seeing. This includes content relating to:

  • pornography
  • self-harm
  • suicide
  • eating disorders

Ofcom figures show that children as young as 8 have accessed pornography online, and that 16% of teenagers have seen material within the last 4 weeks that stigmatises body types or promotes disordered eating.

To protect the next generation from the devastating impact of this content, people now have to prove their age to access pornography or other harmful material on social media and other sites.

Platforms are required to use secure methods such as facial scans, photo ID and credit card checks to verify the age of their users. This means it will be much harder for under-18s to accidentally or intentionally access harmful content.

Ofcom’s codes make clear that platforms are expected to ensure strangers have no way of messaging children. This includes preventing children from receiving direct messages (DMs) from strangers and ensuring that children are not recommended strangers’ accounts to connect with.

Data privacy

While people might see more steps to prove their age when signing up or browsing age-restricted content, they won’t be compromising their privacy.    

The measures platforms put in place must confirm your age without collecting or storing personal data, unless absolutely necessary. For example, facial age estimation tools can estimate your age from an image without saving that image or identifying who you are. Many third-party solutions can tell a platform whether a user is over 18 without sharing any additional data relating to the user’s identity.
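To make that data-minimisation point concrete, here is a minimal sketch assuming a hypothetical third-party age-assurance provider. The names (`AgeCheckResult`, `over_18`, `gate_adult_content`) are illustrative, not a real API; the point is that the platform only ever receives a yes/no answer, never the underlying image or identity document.

```python
from dataclasses import dataclass

# Hypothetical response from a third-party age-assurance provider.
# Field names are assumptions for illustration, not a real API.
@dataclass(frozen=True)
class AgeCheckResult:
    over_18: bool   # the only fact the platform needs
    method: str     # e.g. "facial_estimation" or "photo_id"

def gate_adult_content(result: AgeCheckResult) -> bool:
    """Grant or deny access based solely on the over-18 flag.

    No name, date of birth, document number or image is received or
    stored by the platform, reflecting the data-minimisation
    principle described above.
    """
    return result.over_18

# Example: the provider confirms the user is over 18 via facial age
# estimation; the platform never sees the underlying image.
print(gate_adult_content(AgeCheckResult(over_18=True, method="facial_estimation")))
```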

 The government and the regulator, Ofcom, are clear that platforms must use safe, proportionate and secure methods, and any company that misuses personal data or doesn’t protect users could face heavy penalties.

Services must also comply with the UK’s data protection laws. The Information Commissioner’s Office (ICO) has set out the main data protection principles that services must take into account in the context of age assurance, including minimising personal data which is collected for these purposes.  

Virtual Private Networks

While Virtual Private Networks (VPNs) are legal in the UK, under the Act platforms have a clear responsibility to prevent children from bypassing safety protections. This includes blocking content that promotes VPNs or other workarounds specifically aimed at young users.

This means that where platforms deliberately target UK children and promote VPN use, they could face enforcement action, including significant financial penalties.  

The Age Verification Providers Association (AVPA) reports that an additional 5 million age checks are being carried out daily as UK-based internet users seek to access age-restricted sites.

Legal adult content

Online Safety laws do not ban any legal adult content. Instead, the laws protect children from viewing material that causes real harm in the offline world, devastating young lives and families.    

Under the Act, platforms should not arbitrarily block or remove content and instead must take a risk-based, proportionate approach to child safety duties.

Protecting freedom of speech

As well as legal duties to keep children safe, the same law places clear and unequivocal duties on platforms to protect freedom of expression. Failure to meet either obligation can lead to severe penalties, including fines of up to 10% of global revenue or £18 million, whichever is greater. The Act is not designed to censor political debate and does not require platforms to age-gate any content other than that which presents the most serious risks to children, such as pornography or suicide and self-harm content.
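For illustration, the penalty ceiling described above is simply the greater of two figures. The function name and revenue amounts below are hypothetical examples; only the 10% and £18 million figures come from the Act as described in this article.

```python
def maximum_fine_gbp(global_revenue_gbp: float) -> float:
    """Ceiling on fines under the Online Safety Act: the greater of
    10% of global revenue or a flat 18 million pounds."""
    return max(0.10 * global_revenue_gbp, 18_000_000.0)

# A platform with £500m global revenue: 10% gives a £50m ceiling.
print(maximum_fine_gbp(500_000_000))  # 50000000.0

# A platform with £100m revenue: 10% would be only £10m,
# so the £18m floor applies instead.
print(maximum_fine_gbp(100_000_000))  # 18000000.0
```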