Blog
Deepfakes and the Law: New Offences Targeting Sexually Explicit AI-Generated Images
Rebecca Niblock
Artificial intelligence, and its use on social media, is making it increasingly difficult to distinguish between real and fake information online. Fact checking has long been necessary for written or spoken words, but with the advent of so-called “deepfakes”, we now also need to fact check some of the images and videos we see online.
With the enactment of the Online Safety Act 2023 (OSA), steps have been taken to protect online users against the nefarious uses of this technological advancement, particularly when it comes to sexually explicit images. However, it is clear that more needs to be done. With most people using technology and social media on a daily basis, there has been an increase in the use of sexually explicit images as a means to threaten and cause harm. Once an image is posted on the internet, it is almost impossible to delete it entirely, and such images can now be created by complete strangers with nothing more than a computer programme.
In order to fill the gaps in regulation, the government has recently announced the introduction of further legislation to fight against the creation of sexually explicit deepfakes.
What is a deepfake?
Consider some of the images that have gone viral online over the past couple of years: what do Rihanna and Katy Perry attending the 2024 MET Gala and Pope Francis wearing a Balenciaga puffer jacket have in common? All were fake.
These images qualified as deepfakes because they were constructed using “deep learning” artificial intelligence, which digitally alters an image, video or other piece of media to create a fiction, rather than the images simply being edited. This technology has significantly advanced over recent years and has become much more accessible. This form of AI is now so advanced that it can mimic a person’s facial features to such a degree that it could fool that person’s own parent, as happened with the viral image of Katy Perry attending the MET Gala.
Legislating against sexually explicit deepfakes
Not only are deepfakes being used to impersonate politicians and world leaders, deliberately misleading the public; there has also been an increase in the number of sexually explicit deepfakes being created and shared online. This deeply unsettling trend is extremely violating for the victims, not only with regard to the content of the images, but also given the lack of control they have over how the images will subsequently be used, or who they will be shared with, once uploaded online. For example, in January 2024, deepfake pornographic images of Taylor Swift spread rapidly across the social media platform X (formerly known as Twitter) over the course of a day. One image alone gained 47 million views, and although the platform ultimately took the deepfakes down, the damage had already been done.
This incident is not an isolated one, and many online users are affected by deepfakes. Measures have been in place for a number of years to tackle what is often referred to as “revenge porn”. Previously, under section 33 of the Criminal Justice and Courts Act 2015, it was an offence to share private sexual images online without the other person’s consent, with the intention of causing them alarm, distress or humiliation. As set out in our previous blog, the OSA replaced this offence: section 188 of the OSA inserted a new section 66B into the Sexual Offences Act 2003.
Under s66B it is now an offence to “intentionally share or threaten to share a photograph or film which shows, or appears to show, another person in an intimate state”. This amendment removed the previous legislation’s requirement of an intention to cause alarm, distress or humiliation, and offers much wider protection to complainants.
The OSA also includes in its definition of photographs or film “an image whether made or altered by computer graphics or in any other way, which appears to be a photograph or film”. This is a broad definition, which enables the inclusion of images created through artificial intelligence, namely deepfakes.
Under this amendment, an offender could receive a sentence of up to six months in prison on summary conviction or, on indictment, up to two years if it is proven that they also intended to cause alarm, distress or humiliation, or shared the image to obtain sexual gratification.
Alongside this amendment, section 187 of the OSA has also brought in an offence of sending photographs or films of genitals to another person (s66A of the Sexual Offences Act 2003), i.e. “cyberflashing”. This section in particular was enacted in the hope of offering additional protection to online users against receiving unwanted sexually explicit images.
Recent legislative progress
Building on these changes, and against the backdrop of a rise in viral deepfake images, the government announced on 16 April 2024 that it intended to make it an offence to create sexually explicit deepfakes without consent. The new offence was announced as an amendment to the Criminal Justice Bill, which was making its way through Parliament until the General Election was called.
Specifically, under these changes, individuals creating or designing intimate images of another person "using computer graphics or any other digital technology" for the purposes of causing alarm, distress or humiliation to the person, may face prosecution, and if convicted an unlimited fine and a criminal record.
This means that, under this new legislation (if it is eventually revived and passes in the same form), even if an individual creates a sexually explicit deepfake with no intent to share it, but purely to cause alarm, humiliation or distress to the victim, they will be committing a criminal offence. If the image is then shared more widely, the CPS could choose also to prosecute under section 66B of the Sexual Offences Act 2003 for “sharing or threatening to share intimate photographs or films”, which carries a sentence of up to two years in custody.
Building on some of the other offences, such as “upskirting”, the government is also seeking to incorporate into the Criminal Justice Bill new criminal offences surrounding taking or recording intimate images without consent, or installing equipment to enable someone to do so. If this legislation is adopted, it will be interesting to see how far the courts will be willing to take its application. One example could be whether criminal responsibility will extend to those who provide or install the devices used to create this illegal content, particularly if it could be reasonably believed that they would be used in this manner. It is too soon to say, but is certainly something individuals and companies should keep in mind.
There has been a clear shift towards addressing the need to protect online users, and particularly those falling victim to deepfakes. Regardless of the intentions behind them, deepfake creators will need to remain conscious of the dangers in creating these images, particularly if they are sexually explicit.
The difficulty remains, as with any legislation surrounding AI, whether lawmakers will be able to keep up with the technology. In theory, the use of broader definitions such as “computer graphics” or “any other digital technology” will buy them the time they need to bring laws such as these into force, but lawmakers and practitioners certainly need to stay on their virtual and/or physical toes.
Finally, it also remains to be seen what approach will be taken after the General Election and whether the new government will encourage the revival of the Criminal Justice Bill, and the adoption of the proposed amendment relating to deepfakes.
If you have any questions regarding this blog, please contact Nicola Finnerty or Alice Trotter in our Criminal team.
Nicola Finnerty is a leading criminal defence expert in white collar and business crime, general criminal defence, and proceeds of crime and asset forfeiture.
Alice Trotter is an Associate in the Criminal Litigation team.
We welcome views and opinions about the issues raised in this blog. Should you require specific advice in relation to personal circumstances, please use the form on the contact page.