U.K. Criminalizes Creating Sexually Explicit Deepfake Images

Authored by time.com and submitted by Minifiax

The U.K. will criminalize the creation of sexually explicit deepfake images as part of plans to tackle violence against women.

People convicted of creating such deepfakes without consent, even if they don’t intend to share the images, will face prosecution and an unlimited fine under a new law, the Ministry of Justice said in a statement. Sharing the images could also result in jail.

Rapid developments in artificial intelligence have fueled the creation and dissemination of deepfake images and videos. The U.K. has classified violence against women and girls as a national threat, which means the police must prioritize tackling it, and this law is designed to help them clamp down on a practice increasingly used to humiliate or distress victims.


“This new offence sends a crystal clear message that making this material is immoral, often misogynistic, and a crime,” Laura Farris, minister for victims and safeguarding, said in a statement.

The government is also introducing new criminal offenses for people who take or record real intimate images without consent, or install equipment to enable someone to do so. A new statutory aggravating factor will be brought in for offenders who cause death through abusive, degrading or dangerous sexual behavior.

stickmanDave on April 21st, 2024 at 02:00 UTC »

This is just the start.

Before long it will be possible to train an AI to make a 3D model of a person from photos. Crude versions of this already exist. And software will exist to make video of this model doing pretty much anything the creator can imagine.

But once created, there will be nothing to specifically tie this model to the person it's based on besides the subjective judgment that they look similar.

I don't see how that could be regulated at all. Even assuming some sort of objective measurement tool could be made to determine whether a given model looks too much like a specific person, a tool to tweak the model so it just barely passes the test would not be far behind.

jaa101 on April 21st, 2024 at 00:37 UTC »

"U.K. to Criminalize Creating Sexually Explicit Deepfake Images" is the actual headline now. The proposed law is yet to be introduced into Parliament.

Minifiax on April 21st, 2024 at 00:04 UTC »

It makes me so nervous when sources only specify "against women," as if they just don't care whether it happens to men or not.