The Safeguarding Specialists
01379 871091

If you, or someone you know, have been affected by illegal, harmful or upsetting content online, please contact a support service to find information and get help.

Harmful online content: How to report it and where to get help

The Online Safety Act 2023

The act is published here; see also the government guidance, Online Safety Act: explainer.

This act is a major piece of legislation to protect children and adults online by requiring social media platforms to:

  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content
  • enforce age limits and age-checking measures
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise

The last point is important, as the expectation is that platforms will take responsibility for resolving individual complaints about their own sites. Only if they fail to do so may Ofcom become involved.

Platforms must remove illegal content including:

  • child sexual abuse
  • controlling or coercive behaviour
  • extreme sexual violence
  • fraud
  • hate crime
  • inciting violence
  • illegal immigration and people smuggling
  • promoting or facilitating suicide
  • promoting self-harm
  • revenge porn
  • selling illegal drugs or weapons
  • sexual exploitation
  • terrorism

Platforms will need to protect children from content that is not illegal, but could be harmful or age-inappropriate.

These categories include:

  • pornographic content
  • content that does not meet a criminal threshold but which promotes, encourages or provides instructions for suicide, self-harm or eating disorders
  • content that depicts or encourages serious violence
  • bullying content

Platforms are now regulated by Ofcom and must show they have processes in place to meet the requirements set out by the act. Ofcom will check how effective those processes are at protecting internet users from harm. Ofcom will not respond to or investigate individual complaints; instead, it will use complaints as one measure when assessing the effectiveness of the processes in place.

Penalties include fines of up to £18 million or 10 per cent of annual global turnover, whichever is greater. Criminal action can be taken against senior managers who fail to comply with information requests from Ofcom. Companies and senior managers (where they are at fault) are criminally liable if the provider fails to comply with Ofcom's enforcement notices in relation to specific child safety duties or to child sexual abuse and exploitation on their service.

In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers and internet service providers to stop working with a site, preventing it from generating money or being accessed from the UK.