DIY Facial Recognition for Porn Is a Dystopian Disaster

Authored by vice.com and submitted by speckz

Image via Pornhub / Composition via Samantha Cole

Someone posting on the Chinese social network Weibo claims to have used facial recognition to cross-reference women’s photos on social media with faces pulled from videos on adult platforms like Pornhub.

In a Monday post on Weibo, the user, who says he's based in Germany, claimed to have “successfully identified more than 100,000 young ladies” in the adult industry “on a global scale.”

To be clear, the user has posted no proof that he’s actually been able to do this, and hasn’t published any code, databases, or anything else besides an empty GitLab page to verify the claim is real. When Motherboard contacted the user over Weibo chat, he said he would release a “database schema” and “technical details” next week, and did not comment further.

Still, his posts have gone viral both in China on Weibo and in the United States on Twitter, after a Stanford political science PhD candidate tweeted them with translations, which Motherboard independently verified. This has led prominent activists and academics to discuss the potential implications of the technology.

According to the Weibo posts, the user and some of his programming friends used facial recognition to match faces in porn content against photos from social platforms. His reasoning for making the program, he wrote, was “to have the right to know on both sides of the marriage.” After public outcry, he later claimed his intention was to allow women, with or without their fiancés, to check whether they appear on porn sites and to send copyright takedown requests.

"This is horrendous and a pitch-perfect example of how these systems, globally, enable male dominance," Soraya Chemaly, author of Rage Becomes Her, tweeted on Tuesday about the alleged project. "Surveillance, impersonation, extortion, misinformation all happen to women first and then move to the public sphere, where, once men are affected, it starts to get attention."

Whether the Weibo user’s claims are trustworthy is beside the point, now that experts in feminist studies and machine learning have decried the project as algorithmically targeted harassment. A program like this is both possible and frightening, and its alleged existence has started a conversation about whether such a program could ever be an ethically or legally responsible use of AI.

Just as we saw with deepfakes, which used AI to swap the faces of female celebrities onto the bodies of porn performers, the use of machine learning to control women and violate their bodily autonomy demonstrates deep misogyny. It's a threat that didn't begin with deepfakes, but it certainly reached the public sphere with that technology. In the years since, though, women have been left behind in the mainstream narrative, which has focused on the technology's possible use for disinformation.

Danielle Citron, a professor of law at the University of Maryland who's studied the aftermath of deepfakes, also tweeted about this new claim on Weibo. "This is a painfully bad idea—surveillance and control of women’s bodies taken to new low," she wrote.

What he claims to have done is theoretically possible for someone with a decent amount of machine learning and programming knowledge, given enough time and computing power, though it would be a huge effort with no guarantee of accurate results.

The ability to create a database of faces like this, and deploy facial recognition to target and expose women within it, has been within consumer-level technological reach for some time.

In 2017, Pornhub proudly announced new facial recognition features that it claimed would make it easier for users to find their favorite stars—and, in turn, theoretically easier for abusers or harassers to find their targets. As I wrote at the time:

Even if Pornhub deploys this technology in an ethical way, its existence should be concerning. Such technology is unlikely to stay proprietary for long, and given that some people on the internet make a habit of identifying amateur or unwitting models, the underlying tech could supercharge some of these efforts.

In 2018, online trolls started compiling databases of sex workers in order to threaten and out them. This harassment campaign had real-life consequences, with some sex workers having their payment processing and social media accounts shut down.

What this Weibo programmer claims to have built is a combination of these two ideas: a misogynistic, abusive attempt at controlling women. Whether it's real or not, it's representative of the dark paths down which machine learning technology, and some of the societal toxicity around it, has taken us.

Jordan Pearson contributed reporting to this story.

Dimoson on May 29th, 2019 at 16:58 UTC »

This already happened in Russia with VK and the FindFace app:

https://mashable.com/2016/05/03/facial-recognition-russia-shame-sex-workers/

Orcus424 on May 29th, 2019 at 15:32 UTC »

This technology was in an episode of Better Off Ted about 10 years ago. They found out their department head was a magician's assistant in her spare time.

Redditing-Dutchman on May 29th, 2019 at 15:12 UTC »

That's why, in the near future, I think more and more people will delete their social media accounts. This is just one example. It's just a matter of time before you can point your phone camera at someone walking down the street and instantly see all their available personal data. This will hit some people hard. A small crime can follow you for the rest of your life if it was posted somewhere online (this is already happening to some extent now, with old tweets, for example).

Edit: people are talking about deepfakes and such. Keep in mind that Samsung can now make a video of you talking about a given subject from just a single profile picture. Another company is working on creating fake audio that sounds just like you after hearing only a few sentences of your speech. Combine these two and it starts to become very scary.