New ‘Deepfake’ App That Undresses Women Is More Reason To Restrict Internet Porn

“The problem with pornography is not that it shows too much of the person, but that it shows far too little,” said Pope John Paul II. Thanks to a new artificial intelligence service, it no longer matters how much our photos show or don’t show; they can all be turned into pornography.

The automated service, freely available on the web, allows users to anonymously submit photos of clothed women and receive altered versions with the clothing removed, according to an October 20 article in the Washington Post. Already more than 100,000 women and girls, including some under the age of 18, have had their photos altered this way. If we needed more proof that pornography should be regulated, this technology provides it.

Because the AI technology relies on large databases of actual nude photographs, it can “generate fakes with seemingly lifelike accuracy, matching skin tone and swapping in breasts and genitalia where clothes once were,” notes the WaPo. No labels are appended to the images to designate them as fake. Orders are placed through an automated “chatbot” on Telegram, an encrypted messaging app. The service also assembles a photo gallery of the newly generated fake nudes so that users can have their own little “digital harem.”

It only gets more disturbing. An internal poll found that more than 60 percent of the bot’s users were aiming to “undress” photos of girls or women they knew in real life. The chatbot’s administrator, whom the WaPo interviewed via messages on Telegram, declined to give his name but “defended the tool as a harmless form of sexual voyeurism.” Perverts will be perverts.

This artificial intelligence service is only the latest manifestation of a growing trend of manipulating graphics, not only for sexual purposes but also to influence politics and even warfare. Deepfake videos have been used to superimpose the faces of female celebrities and journalists onto other women engaged in sex scenes.

A 2019 report published by a Dutch cybersecurity startup estimated that 96 percent of all online deepfakes were pornographic. Deepfakes have been made of Daisy Ridley, Gal Gadot, Scarlett Johansson, Taylor Swift, and Maisie Williams, among others.

Obviously the proliferation of this technology is pernicious. Boston University law professor Danielle Citron describes the vulnerability, shame, and anxiety that such deepfakes foster in their victims. The technology can also threaten professional careers and romantic relationships. Often it is used as “revenge porn” by jilted lovers or creepers.

The technology is spreading because its source code has been widely shared by online copycats. An earlier app called “DeepNude” that did essentially the same thing caused such outrage in 2019 that its anonymous programmer deleted it only days after it went live. Yet Pandora’s box had already been opened, and experts don’t see a way to stop similar software from being exploited for the same purposes across an unregulated Internet.

That may be true, but it doesn’t mean we shouldn’t seek to limit the technology’s advancement wherever we can. It is nothing other than a gross intrusion into people’s personal lives that dehumanizes and objectifies its victims. No one should have to fear that somewhere on the Web there are fake nude images of her, distributed by voyeurs eager to indulge in sick fantasies or, even worse, to ruin reputations. Indeed, any man with a daughter, a sister, or a wife should be both disgusted and enraged that such people can operate with impunity.

These photos and videos are already being hosted on websites that aggregate pornography. Such sites are largely unregulated. Indeed, many are calling for shutting down PornHub, the largest pornography website in the world, because it does not police itself and is responsible for enabling and profiting from the mass sex trafficking and exploitation of women and minors. I would not be surprised if many similar websites are guilty of the same.

Pornography has always been a problem for a Western culture that believes human persons and their bodies have an inherent dignity that should not be exploited. Yet the deepfake phenomenon exposes perhaps its biggest blind spot: the murky waters of consent, that ersatz designator for what counts as legal and appropriate sex under capitalism. As porn has become more widespread and accessible, it has become increasingly difficult to ensure that consent has not been violated, especially when porn stars later publicly claim that their consent was.

Deepfakes and AI technology that creates porn from unwilling victims are the epitome of consent violations. Indeed, they threaten to turn anyone into an unwilling porn participant. We thus need to ask what, if anything, can be done to obstruct this worrying trend.

Victims of deepfakes can claim defamation or copyright infringement, or argue that the images should be investigated under current obscenity laws. Platforms should also write their terms of service to prohibit this sort of content. Stronger revenge-porn laws could help when that particular dynamic is in play.

That alone is unlikely to stop this. New laws that punish those who create such media or distribute it without consent would certainly help. Many states, including California, Virginia, Texas, Massachusetts, New York, and Maryland, have either already enacted deepfake laws or have legislation pending. A more extreme political response would be to enact measures that make Internet porn harder to access, or laws that presume pornography to be non-consensual unless a legal bar is satisfied.

Such measures may seem extreme, but consider how you might feel if you, your spouse, or your child were to become the victim of a sexual deepfake. All of us have a right to a certain level of privacy, including the presumption that images and videos of us will not be exploited and disseminated without our consent by and for perverts and deviants. In a time when Big Tech already controls so much data about us, a few things should still remain sacrosanct.