Blog
Ofcom calls on tech firms to make the online world safer for women and girls
The Office of Communications, commonly known as 'Ofcom' (the regulator for communication services), is calling on tech firms to make 'the online world safer for women and girls'.
When the Online Safety Act (the "OSA") was passed in late 2023, Ofcom was granted broad powers to regulate online service providers. The OSA established a new regulatory regime, imposing legal requirements on providers of online services, including social media platforms, blogging platforms and gaming sites, to protect all people in the UK from illegal and harmful content accessed online. In particular, it imposes legal requirements on regulated user-to-user services, search services and pornographic services to keep users safe online. Please see our previous article here for more detailed information on the OSA.
As part of this regulatory regime, Ofcom has initiated a new consultation on its draft guidance ("Guidance"). The Guidance sets out a broad suite of measures online service providers should take not only to protect women and girls from illegal and harmful content, but also to take active steps against harmful content and activity that disproportionately affects women and girls. The draft Guidance is to be read in conjunction with, and in addition to, Ofcom's already published final codes and risk assessment guidance on how it expects online service providers to take action to prevent harmful content. Under the OSA, online service providers have a general duty to protect all users from illegal harm; the Guidance recognises 'the unique risks' that women and girls face online and takes that duty one step further, calling for ambitious action to improve women's and girls' safety, for example in relation to online domestic abuse and online misogyny. Ofcom has made it clear that its role will be to hold tech companies to account, and it draws on practical, real examples in the Guidance to show how tech companies can, and are expected to, do more.
The Guidance asks online service providers to focus on the following key areas of 'harm':
Notably, Ofcom is seeking to cover a wide scope of potential harm, reflecting the wide spectrum of online content and activity and the ever-changing challenge of keeping users safe on online platforms.
Ofcom's draft Guidance sets out nine areas in which online service providers can contribute towards improving women's and girls' online safety and tackle the above key areas of harm (amongst other types of harm), grouped under three 'categories': (1) taking responsibility; (2) designing their services to prevent harm; and (3) supporting their users. Following a 'safety by design' approach, Ofcom expects online service providers to take the following actions:
If it was not already clear from Ofcom's existing guidance, this Guidance cements the 'call to action' expected of online service providers in demonstrating their compliance with the OSA and Ofcom's expectations in respect of it. Ofcom does not, however, expect every online service provider to implement all of the foundational and good practice steps, as not all actions will be appropriate for low-risk platforms and services.
However, online service providers will need to carefully evaluate not just the operation, design and functionality of their services and platforms, but also their leadership's decision-making processes in arriving at the solutions that are ultimately implemented. Taking a 'comply or explain' approach to governance, design and implementation (for example via risk assessments) may strike the right balance between following the Guidance and explaining how a chosen solution, functionality or process is appropriate and upholds the high standards of the OSA. Firms will also need to weigh their duties under the UK General Data Protection Regulation, concerning the use and processing of personal information and privacy rights, against their legal requirements under the OSA; effective automated content moderation, for example, may involve the processing of personal data. For any advice or assistance in assessing compliance with the OSA, related guidance and personal data protection, please get in touch. The consultation ends on 23 May 2025.
If you have any questions regarding this blog, please contact Caroline Sheldon in our Corporate, Commercial & Finance team.
Caroline Sheldon joined the Corporate, Commercial & Finance team in August 2022 as an associate and specialises in advising on commercial matters. She advises entrepreneurs, startups and established businesses across a variety of sectors, with a focus on those in the technology sector.
We welcome views and opinions about the issues raised in this blog. Should you require specific advice in relation to personal circumstances, please use the form on the contact page.