
Ofcom’s new draft guidance for ‘a safer life online for women and girls’ as part of its OSA consultation process

15 April 2025

The Office of Communications ('Ofcom'), the UK regulator for communications services, is calling on tech firms to make 'the online world safer for women and girls'.

When the Online Safety Act (the "OSA") was passed in late 2023, Ofcom was granted broad powers to regulate online service providers. The OSA established a new regulatory regime that imposes legal requirements on providers of regulated user-to-user services, search services and pornographic services, including social media platforms, blogging platforms and gaming sites, to protect people in the UK from illegal and harmful content and to keep users safe online. Please see our previous article here for more detailed information on the OSA.

As part of this regulatory regime, Ofcom has launched a new consultation on its draft guidance (the "Guidance"). The Guidance sets out a broad suite of measures online service providers should take not only to protect women and girls from illegal and harmful content, but also to take active steps against harmful content and activity that disproportionately affects women and girls. The draft Guidance is to be read alongside Ofcom's already published final codes and risk assessment guidance on how it expects online service providers to take action against harmful content. Under the OSA, online service providers have a general duty to protect all users from illegal harm. What this Guidance does is recognise 'the unique risks' that women and girls face online, taking that duty a step further by calling for ambitious action to improve women's and girls' safety, for example in relation to online domestic abuse and online misogyny. Ofcom has made clear that its role will be to hold tech companies to account, and the Guidance draws on practical, real examples to show how tech companies can, and are expected to, do more.

The Guidance identifies the following key areas of 'harm' for online service providers to address:

  1. Online misogyny – this type of harm relates to content that actively encourages misogynistic ideas or behaviours. It covers both illegal content, such as illegal threats, and content which is legal but harmful to children, such as content normalising gendered or sexual violence.
  2. Pile-ons and online harassment – this type of harm relates to content targeted at a specific woman or girl, or at groups of women or girls. As above, it spans both illegal content, such as harassment, and legal content which is harmful to children, such as misogynistic abuse.
  3. Online domestic abuse – this type of harm relates to coercive and controlling behaviours in the context of an intimate relationship.
  4. Image-based sexual abuse – this type of harm relates to the non-consensual sharing of intimate images and to cyberflashing (sending explicit images to someone without their consent).

Notably, Ofcom is seeking to cover a wide scope of potential harm, reflecting the broad spectrum of harmful content and activity online and the ever-changing challenge of keeping users safe on online platforms.

Ofcom's draft Guidance sets out nine areas where online service providers can contribute towards improving women and girls' online safety and tackling the above key areas of harm (amongst other types of harm), grouped under three 'categories': (1) taking responsibility; (2) designing their services to prevent harm; and (3) supporting their users. Following a 'safety by design' approach, Ofcom expects online service providers to take the following actions:

  1. Ensure that their governance and accountability processes address online gender-based harm. Interestingly, Ofcom expects firms not only to change how they prevent harmful content from being accessed on their platforms from a technical perspective, but also to take a top-down approach to management, ensuring a shift in culture and responsibility in leadership's decision-making on the design and operation of services and platforms. Good practice includes firms taking steps to: (a) set online gender-based harm policies; (b) consult with subject matter experts, in particular when drafting a platform's terms and conditions; and (c) train staff involved in setting such policies and in decision-making around safety by design.
  2. Conduct risk assessments that focus on harm to women and girls. Ofcom has included foundational steps related to firms' risk assessment duties, which are set out in its 'Illegal content risk assessment guidance' and 'Draft children's risk assessment guidance' and which aim to help firms comply with those duties. Firms are required to genuinely understand user behaviours and women and girls' experiences, for example by using external assessors to monitor emerging threats and by conducting user surveys, and to use those insights to inform design that effectively addresses the issues raised.
  3. Be transparent about women and girls’ online safety. Good practice includes sharing information about which posts are flagged (or not flagged) by automated content moderation.
  4. Conduct abusability evaluations and product testing. Firms are encouraged to pre-empt how features of a platform could be misused, for example by improving content filters, updating blocklists and removing nudity from training datasets.
  5. Set safer defaults. This includes building functionality that sets stronger platform defaults, as well as more customisable user defaults around interactions, privacy and geolocation. For example, a platform default could automatically remove metadata from images uploaded to the platform (see the sketch after this list), or give users the option to share their location only for a specified time.
  6. Reduce the circulation of online gender-based harms. Ofcom has taken a practical approach in its recommendations, recognising that there is no 'one size fits all' approach to preventing and reducing the spread of online gender-based harm while balancing users' rights. Firms will need to consider carefully which mixture of design approaches will be most effective, such as implementing nudges (to promote certain behaviour and/or discourage other behaviour) and assessing automated processes so that they genuinely improve the user experience.
  7. Give users better control over their own experiences. Firms are encouraged to give users more control while using online services, such as allowing users to delete or change the visibility settings of content they upload, providing mechanisms to block and mute multiple accounts simultaneously, and letting users filter what content is recommended to them by automated processes.
  8. Enable users who experience online gender-based harms to make reports. Good practice for firms includes designing platforms with a trauma-informed approach, such as building effective reporting and flagging functions.
  9. Take appropriate action when online gender-based harms occur. Ofcom encourages firms to take more proactive enforcement action against users who consistently breach their terms of service, such as applying a 'strikes' policy (sketched below), restricting access to certain parts of the service, or imposing an outright ban.
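
On the 'safer defaults' point above, the following is a minimal sketch in Python of how a platform might strip metadata from images at upload. It assumes the Pillow imaging library; the strip_metadata helper and the overall flow are illustrative assumptions, not something prescribed by Ofcom's Guidance.

    # Minimal sketch: re-save an uploaded image with pixel data only, so
    # EXIF metadata (including any GPS location) is dropped by default.
    # Assumes the Pillow imaging library; strip_metadata is a hypothetical
    # helper name, not an API from Ofcom or any particular platform.
    from PIL import Image

    def strip_metadata(src_path: str, dst_path: str) -> None:
        """Copy an image's pixels into a fresh file, discarding metadata."""
        with Image.open(src_path) as img:
            clean = Image.new(img.mode, img.size)  # new image, no metadata
            clean.putdata(list(img.getdata()))     # pixels only, no EXIF/GPS
            clean.save(dst_path)

    # Usage: strip_metadata("upload.jpg", "safe_upload.jpg")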
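Similarly, on the 'strikes' policy point, the sketch below shows one way an escalating enforcement ladder might be modelled. The thresholds and actions are illustrative assumptions only; the Guidance does not prescribe a particular ladder.

    # Minimal sketch of a 'strikes' enforcement ladder. The thresholds and
    # actions below are illustrative assumptions, not taken from the Guidance.
    from dataclasses import dataclass, field

    ACTIONS = {1: "warn", 2: "restrict_features", 3: "temporary_suspension"}
    FINAL_ACTION = "ban"  # applied once the ladder is exhausted

    @dataclass
    class StrikeLedger:
        strikes: dict = field(default_factory=dict)

        def record_violation(self, user_id: str) -> str:
            """Record a terms-of-service breach; return the action to apply."""
            count = self.strikes.get(user_id, 0) + 1
            self.strikes[user_id] = count
            return ACTIONS.get(count, FINAL_ACTION)

    # Usage: a user's first breach returns "warn", the second
    # "restrict_features", the third "temporary_suspension", and any
    # further breach results in an outright "ban".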

If it was not already clear from Ofcom's existing guidance, this Guidance cements the 'call to action': online service providers are expected to demonstrate compliance with the OSA and with Ofcom's expectations of them. That said, Ofcom does not expect every online service provider to implement all foundational and good practice steps, as not all actions will be appropriate for low-risk platforms and services.

Online service providers will, however, need to evaluate carefully not just the operation, design and functionality of their services and platforms, but also their leadership's decision-making processes in arriving at the solutions that are ultimately implemented. Perhaps a 'comply or explain' approach to governance, design and implementation (for example via risk assessments) may strike the right balance between following the Guidance and explaining how a chosen solution, functionality or process is appropriate and upholds the high standards of the OSA. Firms will also need to weigh their duties under the UK General Data Protection Regulation, in respect of the processing of personal information and users' privacy rights, against their legal requirements under the OSA; for example, effective automated content moderation may involve the processing of personal data. For any advice or assistance in assessing compliance with the OSA, related guidance and personal data protection, please get in touch. The consultation closes on 23 May 2025.

Further information

If you have any questions regarding this blog, please contact Caroline Sheldon in our Corporate, Commercial & Finance team.

About the author

Caroline Sheldon joined the Corporate, Commercial & Finance team in August 2022 as an associate and specialises in advising on commercial matters. She advises entrepreneurs, startups and established businesses across a variety of sectors, with a focus on those in the technology sector.


We welcome views and opinions about the issues raised in this blog. Should you require specific advice in relation to personal circumstances, please use the form on the contact page.
