The UK will criminalize the creation of sexually explicit deepfake images under a new law that aims to protect people against non-consensual AI-generated pornographic material.
Under the law, those who create sexually explicit deepfake images of another person without their consent will face prosecution and an unlimited fine, even if they did not intend to share the material.
If the image is then shared more widely, offenders could be sent to jail, the United Kingdom’s Ministry of Justice (MoJ) said in a statement on Tuesday.
AI deepfakes: ‘Immoral, despicable’
As artificial intelligence grows more advanced, some people have used the technology to create deepfakes – realistic but fabricated images, videos, or audio that impersonate someone else, right down to their voice.
The resulting images and videos look and sound just like the person targeted. Pictures of high-profile female celebrities such as Emma Watson and Taylor Swift have been doctored using AI to create deepfake pornographic content. Underage schoolgirls have not been spared either.
“The creation of deepfake sexual images is despicable and completely unacceptable irrespective of whether the image is shared,” UK Minister for Victims and Safeguarding Laura Farris said in a statement.
“It is another example of ways in which certain people seek to degrade and dehumanise others – especially women. And it has the capacity to cause catastrophic consequences if the material is shared more widely. This Government will not tolerate it,” she stated, adding:
“This new offense sends a crystal clear message that making this material is immoral, often misogynistic, and a crime.”
Sharing ‘intimate’ deepfakes was already made illegal in the UK under the Online Safety Act, passed in 2023. The new deepfake pornography offense will be introduced as an amendment to the Criminal Justice Bill, which is making its way through Parliament.
The MoJ said its new law will make it an offense for someone to create a sexually explicit deepfake, even if they have no intention to share it, “but purely want to cause alarm, humiliation or distress to the victim.” It will apply to images of adults because the law already covers similar behavior for children under 18.
Malicious online deepfakes can have a real-world impact, as @MissCallyJane knows.
That’s why we’re cracking down on those who create sexually explicit deepfake images without consent – with offenders facing a criminal record and an unlimited fine.
More: https://t.co/TBX7z0Epsu pic.twitter.com/dhMv7eww7v
— Ministry of Justice (@MoJGovUK) April 16, 2024
Victims welcome new law
In the same MoJ statement, Cally Jane Beech, a campaigner and former Love Island contestant who was targeted with AI-generated deepfake pornographic images earlier this year, said the law is important to protect women.
“This new offence is a huge step in further strengthening of the laws around deepfakes to better protect women. What I endured went beyond embarrassment or inconvenience,” she said.
“Too many women continue to have their privacy, dignity, and identity compromised by malicious individuals in this way and it has to stop. People who do this need to be held accountable,” Beech added.
A recent survey by Glamour found that over 90% of the magazine’s readers believe deepfake technology poses a threat to the safety of women, a concern reinforced by the personal stories the magazine has heard from victims.
“While this is an important first step, there is still a long way to go before women will truly feel safe from this horrendous activity,” said Deborah Joseph, European editorial director of Glamour.
The UK government, which considers violence against women a national threat, is also introducing new criminal offenses for people who take or record real intimate images without consent, or who install equipment to enable someone else to do so.
A new statutory aggravating factor will also be introduced for offenders who cause death through abusive, degrading, or dangerous sexual behavior, or so-called ‘rough sex’, the MoJ said.
AI-generated deepfake images have become more prevalent in recent years, with such content viewed millions of times a month around the world. The fake images and videos are made to look hyper-realistic, with victims usually unaware of them and unable to consent to being sexualized in this way.