
Tech firms must up their game to tackle online harms against women and girls

Source: Ofcom. Published on this website on Tuesday 25 November 2025 by Jill Powell

Online safety watchdog Ofcom today launches new industry guidance demanding that tech firms step up to deliver a safer online experience for millions of women and girls in the UK.

Women and girls face distinct and serious risks online, including misogynistic abuse, sexual violence, coordinated pile-ons, stalking, coercive control and intimate image abuse.

Ofcom’s guidance, published today, includes a wide range of practical safety measures that the regulator is urging tech firms to adopt to tackle these harms. These go above and beyond what is needed to comply with their legal duties under the Online Safety Act, setting a new and ambitious standard for women’s and girls’ online safety. 

The guidance was developed with insights from victims, survivors, safety experts, women’s advocacy groups and organisations working with men and boys. Its launch is also supported by Sport England as part of their wider This Girl Can campaign, and WSL Football to raise awareness of women’s safety when taking part in sport and exercise.

Ofcom has today written to sites and apps setting an expectation that they start to take immediate action in line with the guidance. We will also publish a future report to reveal how individual companies respond.

Being a woman online

Women from all walks of life and backgrounds face disproportionate levels of harm online, which have wide-reaching effects. These harms limit women’s and girls’ ability to take part safely in online spaces, express themselves freely, or even do their jobs. They also contribute to the spread and normalisation of harmful attitudes towards women and girls.

Ofcom’s practical guidance, supported by case-study examples, sets out where tech companies can and should do more, while taking account of important human rights including freedom of expression and privacy. Focusing on the following four main areas of harm, our guidance makes clear how we expect services to design and test their services with safety in mind, and improve their reporting tools and support systems to better protect women and girls:

Misogynistic abuse and sexual violence. This includes content that spreads hate or violence against women, or normalises sexual violence, including some types of pornography. It can be illegal or harmful to children, and is often pushed by algorithms towards young men and boys. Under our guidance, tech firms should consider:

  • introducing ‘prompts’ asking users to reconsider before posting harmful content;
  • imposing ‘timeouts’ for users who repeatedly attempt to abuse a platform or functionality to target victims;
  • promoting diverse content and perspectives through their recommender ‘for you’ systems to help prevent toxic echo chambers; and
  • de-monetising posts or videos which promote misogynistic abuse and sexual violence.

Pile-ons and coordinated harassment. This happens when groups gang up to target a specific woman or group of women with abuse, threats, or hate. Such content may be illegal or harmful to children and often affects women in public life. Under our guidance, tech firms should consider:

  • setting volume limits on posts (“rate limiting”) to help prevent mass-posting of abuse in pile-ons, a technique sketched in code after this list;
  • allowing users to quickly block or mute multiple accounts at once; and
  • introducing more sophisticated tools for users to make multiple reports and track their progress.
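
The “rate limiting” measure in the first bullet above is a standard engineering pattern rather than anything specific to Ofcom’s guidance. As a purely illustrative sketch in Python, assuming an invented cap of 10 posts per account per rolling 60-second window (not a figure from the guidance), a sliding-window limiter might look like this:

  import time
  from collections import deque

  class SlidingWindowRateLimiter:
      """Caps how many posts one account may make within a rolling window.

      Illustrative only: the 10-posts-per-60-seconds default is an invented
      example threshold, not a figure from Ofcom's guidance.
      """

      def __init__(self, max_posts: int = 10, window_seconds: float = 60.0):
          self.max_posts = max_posts
          self.window_seconds = window_seconds
          self._timestamps: dict[str, deque] = {}

      def allow_post(self, account_id: str) -> bool:
          now = time.monotonic()
          window = self._timestamps.setdefault(account_id, deque())
          # Discard timestamps that have aged out of the rolling window.
          while window and now - window[0] > self.window_seconds:
              window.popleft()
          if len(window) >= self.max_posts:
              return False  # Over the cap: reject, delay or flag for review.
          window.append(now)
          return True

A real service would tune the window and threshold to its own traffic patterns, and might queue over-limit posts for human review rather than reject them outright.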

Stalking and coercive control. This covers criminal offences where a perpetrator uses technology to stalk an individual or control a partner or family member. Under our guidance, tech firms should consider:

  • ‘bundling’ safety features to make it easier to set accounts to private;
  • introducing enhanced visibility restrictions to control who can see past and present content;
  • ensuring stronger account security; and
  • removing geolocation by default.

Image-based sexual abuse. This refers to criminal offences involving the non-consensual sharing of intimate images and cyberflashing. Under our guidance, tech firms should consider:

  • using automated technology known as ‘hash-matching’ to detect and remove non-consensual intimate images, illustrated in the sketch after this list;
  • blurring nudity, giving adults the option to override; and
  • signposting users to supportive information, including how to report a potential crime.
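
Hash-matching, named in the first bullet above, compares a fingerprint of each uploaded image against a database of fingerprints of known abusive images. The sketch below is a simplification, not Ofcom’s or any vendor’s implementation: it uses SHA-256, which only catches byte-identical copies, whereas production systems use perceptual hashes (such as PhotoDNA or PDQ) that survive resizing and re-encoding. The hash database is an invented stand-in.

  import hashlib

  # Hypothetical store of hashes of known non-consensual intimate images,
  # e.g. populated from an industry hash-sharing scheme. Empty placeholder.
  known_abuse_hashes: set = set()

  def image_fingerprint(image_bytes: bytes) -> str:
      # SHA-256 stands in for a perceptual hash to keep the sketch short.
      return hashlib.sha256(image_bytes).hexdigest()

  def should_block_upload(image_bytes: bytes) -> bool:
      # Block the upload if its fingerprint matches a known abusive image.
      return image_fingerprint(image_bytes) in known_abuse_hashes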

More broadly, we expect tech firms to subject new services or features to ‘abusability’ testing before they roll them out, to identify from the outset how they might be misused by perpetrators. Moderation teams should also receive specialised training on online gender-based harms.

Companies are expected to consult with experts to design policies and safety features that work effectively for women and girls, while continually listening and learning from survivors’ and victims’ real-life experiences, for example through user surveys.

What happens now?

Ofcom is setting out a five-point action plan to drive change and hold tech firms to account in creating a safer life online for women and girls. We will:

  1. Enforce services’ legal requirements under the Online Safety Act
    We’ll continue to use the full extent of our powers to ensure platforms meet their duties in tackling illegal content, such as intimate image abuse or material which encourages unlawful hate and violence.
  2. Strengthen our industry Codes
    As changes to the law are made, we will further strengthen the measures in our illegal harms industry Codes. We’re already consulting on measures requiring the use of hash-matching technology to detect intimate image abuse, and our Codes will also be updated to reflect cyberflashing becoming a priority offence next year.
  3. Drive change through close supervision
    We have today written an open letter to tech firms as the first step in a period of engagement to ensure they take practical action in response to our guidance. We plan to meet with companies in the coming months to underline our expectations and will convene an industry roundtable in 2026.
  4. Publicly report on industry progress to reduce gender-based harms
    We’ll report in summer 2027 on progress made by individual providers, and the industry as a whole, in reducing online harms to women and girls. If their action falls short, we will consider making formal recommendations to Government on where the Online Safety Act may need to be strengthened.
  5. Champion lived experience
    The voices of victims, survivors and the expert organisations which support them will remain at the heart of our work in this area. We will continue to listen to their experiences and needs through our ongoing research and engagement programme.