
SAFE Newsfeed

Almost half of 8 to 17-year-olds have been scammed online

Source: UK Safer Internet Centre (UKSIC), published on this website Wednesday 5 March 2025 by Jill Powell

For Safer Internet Day 2025, we conducted brand-new research into young people's experiences of scams online. This research found that:

  • 79% of young people come across online scams at least monthly, with 20%, including children as young as 8, seeing scams online every day.
  • Almost a fifth of 8 to 17-year-olds (18%) know someone their age who has lost money to an online scam.
  • Over a quarter (26%) of young people who've been scammed online blame themselves.
  • 74% of 8 to 17-year-olds want to learn more about how to spot scams online.


Consultation on draft Guidance: A safer life online for women and girls

Source: Ofcom, published on this website Tuesday 4 March 2025 by Jill Powell

We are consulting on draft guidance, which sets out nine areas where technology firms should do more to improve women and girls’ online safety by taking responsibility, designing their services to prevent harm and supporting their users.

The Online Safety Act 2023 makes platforms – including social media, gaming services, dating apps, discussion forums and search services – legally responsible for protecting people in the UK from illegal content and content harmful to children, including harms that disproportionately affect women and girls. 

Ofcom has already published final Codes and risk assessment guidance on how we expect platforms to tackle illegal content, and we’ll shortly publish our final Codes and guidance on the protection of children. Once these duties come into force, Ofcom’s role will be to hold tech companies to account, using the full force of our enforcement powers where necessary.  

But Ofcom is also required to produce guidance setting out how providers can take action against harmful content and activity that disproportionately affects women and girls, in recognition of the unique risks they face. 


Responding to this consultation:

Please submit responses using the consultation response form (ODT, 98.85 KB) by 5pm on 23 May 2025.

Main documents

Consultation Document A safer life online for women and girls • PDF • 1.14 MB • 25 February 2025

Annex A Draft Guidance • PDF • 1.95 MB • 25 February 2025

Guidance at a Glance • PDF • 355.35 KB • 25 February 2025

Trosolwg (Welsh-language overview) • PDF • 1.1 MB • 25 February 2025

If you have been affected by these harms, you can find support services through the Domestic Abuse Commissioner and the Victim and Witness Information service. If you’re worried someone might share your intimate images online, or it has already happened to you, see StopNCII and the Revenge Porn Helpline.

Mencap responds to “Lost in the System: The Need for Better NHS Admin” from The King’s Fund

Source: Mencap, published on this website Thursday 20 February 2025 by Jill Powell

The report highlights the scale and impact of poor NHS administration and patient communications, including on people with long-term conditions. 

Jon Sparkes, OBE, Chief Executive at learning disability charity Mencap, says:

“It's deeply worrying to hear people are being deterred from getting medical treatment because of NHS admin issues and this can be particularly harmful for people with a learning disability, who already face huge inequalities in getting the healthcare they need.

“Having to chase test results, not knowing who to contact or how long you might have to wait for treatment is hard for anyone and it’s unsurprising that people end up with delays to care or treatment or feel like giving up. It is an even heavier burden for people with a learning disability who are struggling to navigate a complex health system.

“For people with a learning disability these barriers to healthcare form part of the picture where they die on average up to 23 years earlier than the general population, with many of their deaths avoidable. Delays to accessing care and treatment are a key driver of these shocking inequalities.

“Poor admin isn't just an inconvenience; it actively deepens health inequalities. The Government’s 10 Year NHS Plan must fix these preventable admin failures and ensure the healthcare system meets the needs of people with a learning disability. This includes ensuring they are supported to access easy-to-understand information and are able to contact NHS services in a more accessible way.”

2025 Appropriate filtering and monitoring definitions published for public consultation

Source: UK Safer Internet Centre (UKSIC), published on this website Monday 3 March 2025 by Jill Powell

The UK Safer Internet Centre (UKSIC) has published a draft of its ‘appropriate’ filtering and monitoring definitions for 2025 for public consultation. The definitions help both schools and providers understand what is considered ‘appropriate’, and comments are invited on this year’s proposed revisions.

Alongside the DfE’s introduction of statutory guidance and Prevent Duty obligations, UKSIC first published its filtering and monitoring definitions in 2016 to help both schools and providers understand what should be considered ‘appropriate’.

Included here are the final proposed revisions, alongside a version that tracks the changes compared to 2024 and a summary of the substantive changes, for both Filtering and Monitoring.

Age checks to protect children online

Source: Ofcom, published on this website Thursday 16 January 2025 by Jill Powell

Children will be prevented from encountering online pornography and protected from other types of harmful content under Ofcom’s new industry guidance which sets out how we expect sites and apps to introduce highly effective age assurance.

Today’s decisions are the next step in Ofcom implementing the Online Safety Act and creating a safer life online for people in the UK, particularly children. It follows tough industry standards, announced last month, to tackle illegal content online, and comes ahead of broader protection of children measures which will launch in the Spring.

Robust age checks are a cornerstone of the Online Safety Act. It requires services which allow pornography or certain other types of harmful content to introduce ‘age assurance’ to ensure that children are not normally able to encounter it. Age assurance methods – which include age verification, age estimation or a combination of both – must be ‘highly effective’ at correctly determining whether a particular user is a child.

We have today published industry guidance on how we expect age assurance to be implemented in practice for it to be considered highly effective. Our approach is designed to be flexible, tech-neutral and future-proof. It also allows space for innovation in age assurance, which represents an important part of a wider safety tech sector where the UK is a global leader. We expect the approach to be applied consistently across all parts of the online safety regime over time.

While providing strong protections to children, our approach also takes care to ensure that privacy rights are protected and that adults can still access legal pornography. As platforms take action to introduce age assurance over the next six months, adults will start to notice changes in how they access certain online services. Our evidence suggests that the vast majority of adults (80%) are broadly supportive of age assurance measures to prevent children from encountering online pornography.

What are online services required to do, and by when?

The Online Safety Act divides online services into different categories with distinct routes to implement age checks. However, the action we expect all of them to take starts from today:

  • Requirement to carry out a children’s access assessment: All user-to-user and search services – defined as ‘Part 3’ services – in scope of the Act must carry out a children’s access assessment to establish if their service – or part of their service – is likely to be accessed by children. From today, these services have three months to complete their children’s access assessments, in line with our guidance, with a final deadline of 16 April. Unless they are already using highly effective age assurance and can evidence this, we anticipate that most of these services will need to conclude that they are likely to be accessed by children within the meaning of the Act. Services that fall into this category must comply with the children’s risk assessment duties and the children’s safety duties.
  • Measures to protect children on social media and other user-to-user services: We will publish our Protection of Children Codes and children’s risk assessment guidance in April 2025. This means that services that are likely to be accessed by children will need to conduct a children’s risk assessment by July 2025 – that is, within three months. Following this, they will need to implement measures to protect children on their services, in line with our Protection of Children Codes, to address the risks of harm identified. These measures may include introducing age checks to determine which of their users are under 18 and protect them from harmful content.
  • Services that allow pornography must introduce processes to check the age of users: All services which allow pornography must have highly effective age assurance processes in place by July 2025 at the latest to protect children from encountering it. The Act imposes different deadlines on different types of providers. Services that publish their own pornographic content (defined as ‘Part 5’ services), including certain generative AI tools, must begin taking steps immediately to introduce robust age checks, in line with our published guidance. Services that allow user-generated pornographic content – which fall under ‘Part 3’ services – must have fully implemented age checks by July.

What does highly effective age assurance mean?

Our approach to highly effective age assurance and how we expect it to be implemented in practice applies consistently across three pieces of industry guidance, published today. Our final position, in summary:

  • confirms that any age-checking methods deployed by services must be technically accurate, robust, reliable and fair in order to be considered highly effective;
  • sets out a non-exhaustive list of methods that we consider capable of being highly effective. These include open banking, photo ID matching, facial age estimation, mobile network operator age checks, credit card checks, digital identity services and email-based age estimation;
  • confirms that methods including self-declaration of age and online payments which don’t require a person to be 18 are not highly effective;
  • stipulates that pornographic content must not be visible to users before, or during, the process of completing an age check (see the sketch after this list). Nor should services host or permit content that directs or encourages users to attempt to circumvent an age assurance process; and
  • sets expectations that sites and apps consider the interests of all users when implementing age assurance – affording strong protection to children, while taking care that privacy rights are respected and adults can still access legal pornography.
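
To make the visibility stipulation in the fourth bullet concrete, here is a minimal sketch of how a service might gate restricted content behind a completed age check. This is an illustrative example only, not code from Ofcom’s guidance: the choice of Python and all names (AgeCheckStatus, serve_page) are assumptions made for the purpose of the sketch.

from enum import Enum, auto

class AgeCheckStatus(Enum):
    # Hypothetical states of a user's age-assurance journey.
    NOT_STARTED = auto()
    IN_PROGRESS = auto()
    VERIFIED_ADULT = auto()
    FAILED = auto()

def serve_page(status: AgeCheckStatus, restricted_html: str, holding_html: str) -> str:
    # Release restricted content only on an explicit verified-adult outcome.
    # Any other state (not started, in progress, failed) returns a neutral
    # holding page, so nothing restricted is visible before or during the check.
    if status is AgeCheckStatus.VERIFIED_ADULT:
        return restricted_html
    return holding_html

# Example: a user who is mid-check sees only the holding page, not a preview.
print(serve_page(AgeCheckStatus.IN_PROGRESS, "<restricted content>", "<holding page>"))

The design point is the default-deny branch: content is released only on an explicit verified outcome, so an incomplete or failed check can never leak a preview of the restricted material.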

We consider this approach will secure the best outcomes for the protection of children online in the early years of the Act being in force. While we have decided not to introduce numerical thresholds for highly effective age assurance at this stage (e.g. 99% accuracy), we acknowledge that numerical thresholds may complement our four criteria in the future, pending further developments in testing methodologies, industry standards, and independent research.

Opening a new enforcement programme

We expect all services to take a proactive approach to compliance and meet their respective implementation deadlines. Today Ofcom is opening an age assurance enforcement programme, focusing our attention first on Part 5 services that display or publish their own pornographic content.

We will contact a range of adult services – large and small – to advise them of their new obligations. We will not hesitate to take action and launch investigations against services that do not engage or ultimately comply.