
Online Safety – Year in review 2025

21 January 2026

This online safety update provides a summary of the significant changes to Ofcom’s implementation of the Online Safety Act 2023 and the regulator’s enforcement strategy over the last year, and their potential impact on tech companies.

 
While Ofcom has been implementing the Act in stages, we can expect the regulator to become increasingly ambitious over the coming months, taking greater enforcement steps against social media and search services, which will remain in the spotlight alongside adult/pornographic services.
 

1. New duties for platforms and risks to senior managers 

 
To implement the Online Safety Act (“OSA”) Ofcom prepared a multi-stage plan which came into effect throughout 2025. Platforms are now subject to new obligations around age verification, prevention of illegal harms and the mitigation of risks to children. To meet these duties, online service providers must carry out risk assessments for each of these areas and, depending on the nature of their service, may be required to submit the assessments to Ofcom for review.
 
Non-compliance carries significant financial risk. The primary sanction is a fine of up to £18 million or 10% of the service provider’s qualifying worldwide revenue (whichever is greater). Ofcom also has the power to seek a court order imposing business disruption measures, creating further operational risks for platforms that fail to meet their obligations.
 
To enforce these new obligations, Ofcom has been granted new information gathering powers, including the power to issue information notices. These require platforms to supply the information necessary for Ofcom to evaluate compliance with its online safety duties. In December 2025, Ofcom imposed a fine of £20,000 on a file-sharing service that failed to comply with such a notice. Ofcom issued the binding information request because file-sharing platforms of this kind can be used for the widespread distribution of child sexual abuse material (“CSAM”).
 
Directors and senior managers must also be aware of their own potential exposure under the OSA. The Act provides that certain offences, such as threatening or false communications, can lead to personal criminal liability where they are committed by the body corporate and it is proved that the offence was committed with the consent or connivance of a corporate officer (i.e. a director, manager, associate, secretary or other similar officer), or is attributable to any neglect on the part of a corporate officer. In addition, senior managers could be held criminally liable if they fail to comply with an information notice or fail to take all reasonable steps to prevent such a failure from occurring.
 
For further details on corporate criminal liability under the OSA please see our article here.
 

2. Ofcom’s enforcement drive

 
With its new guidance and the associated duties for corporates coming into force, Ofcom has adopted a proactive and assertive approach to enforcement. It launched its enforcement programme at the beginning of 2025, designed to review industry compliance with the OSA.
 
Throughout 2025, Ofcom launched numerous investigations into platforms for suspected non-compliance with age assurance requirements and failures to protect users against illegal harms and harms to children. In October 2025, Ofcom stated in its most recent update on OSA investigations that it had opened 21 investigations since March 2025, and since that update it has announced several further investigations into alleged breaches of the OSA.
 
This follows a trend of Ofcom taking further regulatory action against tech companies, particularly those at higher risk of facilitating online harms. By way of example, in November 2025, the regulator opened investigations into 20 additional pornography sites and exercised its regulatory powers to fine the AI deepfake “nudification” site Undress.cc £50,000 for failing to implement age checks. Undress, which uses generative AI to create nude images of people, is operated by Itai Tech Ltd. Reports indicate that Ofcom has imposed additional penalties on Itai Tech Ltd for failing to comply with Ofcom’s statutory information request.
 
While current enforcement activity is focused on age verification measures, these robust steps illustrate that Ofcom is only beginning to exercise the full breadth of its OSA powers. There is significant political will behind the regulation of the UK tech sector, with the former Home Secretary Yvette Cooper stating that “AI is putting online child abuse on steroids” and senior leadership at Ofcom emphasising that age verification implementation to protect children from harmful content is “non-negotiable”. 
 
In December 2025, Ofcom published its new guidance for tech companies aimed at tackling online harms against women and girls. The guidance focuses on, among other things, online misogynistic abuse and harassment, stalking, domestic abuse (including coercive control), and intimate image abuse (also referred to as image-based sexual abuse). Its stated aim is to ensure “a safer online experience for millions of women and girls in the UK”. Dating app and social media services are therefore likely to face heightened scrutiny and enforcement measures. Ofcom is expected to provide further guidance to tech companies on how to proactively protect their users from unsolicited sexual images and media.
 
The guidance also sets out examples of “good practice steps” that companies can adopt. Significantly, Ofcom has already written to a number of sites and app providers, making clear that they are expected to “start to take immediate action in line with the guidance”.
 
For a deeper analysis of the new guidance, please see our previous article here.
 

3. The challenges and concerns raised about the OSA regime 

 
The implementation of the new OSA obligations has attracted criticism and legal challenge, with many questioning the adequacy of the Act and its impact on free speech.
 
One of the most prominent challenges came from Wikipedia, which sought a judicial review of Ofcom’s categorisation framework. The framework would classify Wikipedia as a Category 1 service, subjecting it to enhanced duties, including requirements to collect additional data about its contributors. Although Wikipedia’s application was ultimately dismissed, the case highlights the continuing debate around the scope and proportionality of online safety regulation.
 
Data privacy concerns have also been widely raised, prompting a rise in the use of VPNs to circumvent the OSA regulations. When the new rules came into force, VPN providers saw a surge in downloads, with several claiming the top spots in app store charts. Proton VPN, for example, reported a 1,400% increase in sign-ups, with other providers seeing similar rises. These virtual private networks enable users to bypass the regulations, most notably the age verification/assurance measures that platforms are required to implement in compliance with the Act.
 
The ease with which users can access digital tools to circumvent the legislation raises questions about its overall effectiveness and the practical challenges Ofcom and regulated platforms face in ensuring compliance.
 

4. From “incels” to “rage baiting”: tackling emerging threats in male-dominated online spaces

 
With growing attention on the risks to children, as well as to women and girls, we also anticipate steps being taken by Ofcom to address the specific risks to, and from, young boys.
 
The huge popularity of the Netflix drama Adolescence has played a significant role in bringing the dangers and real-world consequences of the “manosphere” and its online influence into mainstream discussion. These online spaces, often frequented by young boys, promote misogynistic behaviour and harmful ideologies. As illustrated through the character of Jamie Miller, there is a heightened awareness of how such groups can influence boys’ views of the world. Terms such as “incels”, “red pill” and “black pill” have become part of popular vocabulary.
 
For a deeper dive into the themes raised in Adolescence and the role of criminal defence solicitors in the context of youth cases, please see our previous article here.
 
Ofcom and the NCA have each published studies highlighting the risks associated with the “manosphere”. The findings emphasise not only the risk of harm directed towards women and girls but also the vulnerability of the boys and young men drawn into these communities. Those who are socially isolated are particularly susceptible to the hierarchical structures and rhetoric within these groups, leaving them at risk of deeper immersion, worsening mental and physical health, and exposure to more extreme ideologies. Academic research reveals that only a small number of users in these online incel forums and groups regularly post extreme content, but these posts have the effect of radicalising the wider social network of young boys and vulnerable men.
 
The NCA has also launched a campaign to tackle sextortion among young teenage boys. There is a notable lack of understanding amongst young boys: the NCA’s campaign found that 74% of the boys questioned did not fully understand what sextortion was. Deemed an “emerging threat”, the NCA reports that teenage boys are increasingly joining online groups to share extreme material.
 
In its 2025 National Strategic Assessment, the NCA identified online networks engaging in diverse online offences (“Com networks”), including grooming, blackmailing and threatening victims into carrying out extreme acts, such as sharing sexual material and self-harming. Vulnerable young victims are targeted and groomed online (for example through social media and gaming services) and controlled through manipulation tactics to extort imagery and cause harm. Europol’s Internet Organised Crime Threat Assessment (2024) similarly found that “self-generated sexual material constitutes a significant share of the child sexual abuse material (CSAM) detected online”. These networks typically attract young men promoting nihilistic and misogynistic views, who attempt to gain status with other users by committing or encouraging harmful acts.
 

5. From scams to sextortion: how deepfakes are reshaping online threats

 
Deepfakes have increasingly become part of mainstream awareness, with the associated risks continuing to grow.
 
As part of the Criminal Justice Bill 24-26, an amendment was introduced aiming to prevent AI from being exploited to create deepfake CSAM. This would effectively make the creation of sexually explicit deepfakes illegal, in line with recommendations by experts and campaign groups within the violence against women and girls (VAWG) sector. Europol also notes that AI-generated CSAM is likely to become more prominent in the near future. AI-generated CSAM presents a significant challenge for law enforcement, as it becomes more difficult to distinguish real victims from synthetic subjects.
 
The proposed legislation aims to empower AI developers and child protection organisations such as the IWF to test AI models for safeguards against generating CSAM, extreme pornography and non-consensual indecent images.
 
While public figures remain the main targets, we have also seen the increased use of deepfakes in fraudulent activities against companies and private individuals. More stories about romance fraud (or “pig butchering” scams) have been publicised, whereby fraudsters identify victims using dating apps and cultivate relationships with them over time, leveraging deepfake videos via remote calls/video conferencing platforms and, eventually, extorting money from them. With this new generation of “catfishing” becoming much harder to identify, it falls to users to become more wary and technologically savvy, keeping up to date with the latest fraudulent tactics and trends as part of their day-to-day online activities. Following its review into romance fraud in October 2025, the FCA called upon banks to take greater action on preventing romance fraud, revealing that it was responsible for losses of £106 million in the financial year 2024/2025.
 
For more detail on the latest romance fraud tactics, please see our previous article here.
 
We have also seen increased risks to corporates, not only through fraudulent financial transactions but also through disinformation. This ushers in a new age of identity fraud in which fraudsters can replicate image, video and audio material convincingly enough to induce individuals to carry out their intended actions. UK regulators and cybersecurity agencies have issued several warnings about deepfake-enabled fraud in financial transactions and executive impersonation.
 

6. New priority offences and wider enforcement action

The regulator may be increasingly ambitious over the coming months in respect of its enforcement steps against tech companies. Notably, Ofcom has stated its intention to strengthen industry codes to reflect cyberflashing becoming a priority offence in 2026, an offence which is most often perpetrated via social media platforms, dating apps and file-sharing services. Online content depicting non-fatal strangulation or suffocation in pornography is also set to be designated a priority offence under the Act, with such depictions becoming illegal content by way of an amendment to the Crime and Policing Bill.
 
As seen with Ofcom’s investigation into Grok, the AI chatbot on X, which users reported was being used to create non-consensual sexually explicit deepfakes of adults and deepfake CSAM, the regulator is prepared to take swift action against platforms and to use the full suite of enforcement tools available. This month, Ofcom launched its formal investigation into Grok AI, requiring X to outline the steps it has taken to protect UK users and ensure compliance with the OSA.
 

What’s next in online safety regulation?

The year 2025 has seen Ofcom take significant steps in relation to online safety regulation. While the OSA’s implementation has received some criticism from the tech industry, many acknowledge the new regime marks a positive step in the journey toward protecting online users.
 
While recent enforcement actions have targeted the adult sector and platforms producing predominantly pornographic content, Ofcom’s actions illustrate a strong intention to police platforms more broadly on online harms and illegal content, and to use the full extent of its regulatory powers, including significant financial sanctions.
 
Tech companies can therefore expect to be increasingly required to take proactive measures to detect and remove such illegal material, ideally before it reaches service users.

 

If you have any questions regarding this blog, please contact Nicola Finnerty or Alice Trotter in our Criminal Litigation team.

About the authors 

Nicola is a leading defence lawyer specialising in high profile and complex Government enforcement cases, proceeds of crime, white collar crime, fraud, asset forfeiture, investigations and AML in the UK and internationally.

Alice is an Associate in the Criminal Litigation team. Alice’s practice includes all areas of criminal litigation, with particular expertise in Online Safety, serious and general crime, and white-collar crime.  She represents individuals and corporate clients from the initial stages of an investigation through to trial.

Isabella is a trainee solicitor at Kingsley Napley and is currently in her third seat with the Criminal Litigation team.
