UK government pledges law against sexually explicit deepfakes

Not just making them, but sharing them too


The UK government has promised to make the creation and sharing of sexually explicit deepfake images a criminal offence.

It said the growth of artificially created but realistic images was alarming, causing devastating harm to victims, particularly the women and girls who are most often targeted.

The government promises to introduce a new offence under its Crime and Policing Bill, meaning perpetrators could be charged for both creating and sharing these images. The bill will be introduced when parliamentary time allows.

It will also create new offences covering the taking of intimate images without consent, as well as the installation of equipment for the purpose of capturing such images.

In a statement, victims minister Alex Davies-Jones said: "It is unacceptable that one in three women have been victims of online abuse. This demeaning and disgusting form of chauvinism must not become normalized.

"These new offences will help prevent people being victimized online. We are putting offenders on notice – they will face the full force of the law," she said.

A two-year jail term could apply to both criminals who take an intimate image without consent and those who install equipment for that purpose.

In a statement Baroness Jones, technology minister, said: "With these new measures, we're sending an unequivocal message: creating or sharing these vile images is not only unacceptable but criminal. Tech companies need to step up too - platforms hosting this content will face tougher scrutiny and significant penalties."

The Ministry of Justice said the sexually explicit deepfake offences will apply to images of adults, as the law already covers such images of children.

It is already an offence to share or threaten to share intimate images, including deepfakes, under the Sexual Offences Act 2003, following amendments that were made by the Online Safety Act 2023.

In September last year, some of the largest AI firms in America promised to prevent their AI products from being used to generate non-consensual deepfake pornography and child sexual abuse material.

Adobe, Anthropic, Cohere, Microsoft, OpenAI, and open source web data repository Common Crawl were among those making the non-binding commitments to the Biden administration.

Google's YouTube has also created privacy guidelines that allow people to request the removal of AI-generated videos that mimic them, the company said in July last year. ®
