Women, Not Politicians, Are Targeted Most Often by Deepfake Videos

Authored by cigionline.org and submitted by BurstYourBubbles

In a culture rife with misinformation and disinformation, people can easily be duped into believing they are reading or seeing something that has no basis in reality. Deepfake videos have added to this confusion, sometimes presenting content that is meant to deceive the viewer or to drastically misrepresent the person in the video. With the advent of deepfakes, viewers now need to question whether what they are seeing in a video is real at all.

Much of the public concern about deepfakes has centred on fears about them being used to disrupt politics or business. But the reality is that deepfake technology is predominantly being used to create sexual videos of women without their consent.

Deepfake videos are a form of synthetic media that uses artificial intelligence to swap out the faces of people in video footage. When done well, these realistic videos can be quite convincing, effectively turning the person featured into a puppet.

Fake videos have been made of politicians endorsing views contrary to their own, public figures confessing to wrongdoings, and women depicted in sexual acts they never took part in. Some of these videos are clearly deepfakes, due to their low-quality visual effects, unusual contextual setting or the explicit acknowledgement that they are deepfakes. But many others are nearly impossible to distinguish from a real video and are not labelled as fakes.

The initial popularity of deepfakes was fuelled by the non-consensual creation of sexual deepfakes featuring female celebrities. In 2017, Motherboard journalist Samantha Cole reported that publicly available open-source software made it possible for anyone with some programming skills and a decent graphics card to create these types of videos.

SlySychoGamer on March 9th, 2021 at 06:17 UTC »

Imagine that, guys would rather see celeb face-swap porn than Obama or Trump saying they will nuke someone.

Splurch on March 9th, 2021 at 05:55 UTC »

We'll likely see more against politicians as the tech gets better. They'll start becoming a problem when they're good enough that you can't tell they're fake while watching.

smoke_and_spark on March 9th, 2021 at 03:52 UTC »

You mean porn....