
Your Daughter’s Face Could Be Hijacked For ‘Deepfake’ Porn

We are barely scratching the surface of the dystopian spike in image-based sexual abuse.


Fourteen-year-old Francesca’s life has changed forever.  

An email sent by her high school’s principal to her family on Oct. 20, 2023, notified them that Francesca was one of more than 30 students whose images had been digitally altered to appear as synthetic sexually explicit media — sometimes referred to as “deepfake” pornography.  

Speaking to media, Francesca shared how she felt betrayed, saying, “We need to do something about this because it’s not OK, and people are making it seem like it is.” She’s right — something must be done.

The issue of image-based sexual abuse (IBSA) — whether it takes the form of nonconsensual AI or “deepfake” content, nonconsensual recording or sharing of explicit imagery, extortion or blackmail based on images, recorded sexual abuse, or any of its many other forms — can feel like something that happens to “others.” It’s a headline we scroll past. It can feel distant from our own lives. But that’s far from the truth.

If anyone has ever taken a video or photo of you and posted it online, even an innocent family photo or professional headshot, that’s all it takes.

You and your loved ones are personally at risk of having your images turned into sexually explicit “synthetic,” “nudified,” or “deepfake” content.

It doesn’t take a tech genius lurking on the dark web to do this: the code and tools are freely available on popular open-source platforms like Microsoft’s GitHub and are shared widely online. In fact, GitHub hosts the source code for the software used to create 95 percent of sexual deepfakes, despite having been notified of the exploitative code by anti-exploitation advocates.

These kinds of images can be created in less time than it takes to brew a cup of coffee.

Even people who don’t know how to create these images can openly solicit and pay others to do so on websites like Reddit, where entire communities exist to trade and create nonconsensual explicit material.

And here’s the kicker: these images aren’t some sloppy Photoshop job pasting a face onto a body, a la 1990. Top executives at one of the most innovative technology companies in the world have told us that even they typically cannot tell whether an image is synthetic, artificial pornography. There is no easy watermark separating fake from real.

And of course, sexual imagery is often created and shared consensually between romantic partners, then shared nonconsensually later, a practice sometimes called revenge pornography. A 2017 survey found that one in eight participants had been targeted by this kind of distribution, or the threat of it, without their consent. Not to mention the countless adult sex trafficking and abuse survivors who have their exploitation recorded. These problems are also rampant within the pornography industry, as we’ve seen on Pornhub, XHamster, and other pornography sites.

Victims face an uphill battle, and many try to fight it alone. Most victims of IBSA (73 percent) didn’t turn to anyone for help. At most, you can contact the social media company and ask them to take the content down, with mixed results, or perhaps your state’s law could hold the person who uploaded the image liable. But this doesn’t stop the same image from being re-uploaded and shared again, often to the same platform or to others. Bumble and a few other companies are beginning to hash IBSA images to prevent them from being re-uploaded; the technology exists and is effective, but on the whole, few tech companies take advantage of it. You could spend hours a day searching for and reporting explicit imagery of yourself, only to wake up the next day and find it all back online.
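The hashing approach Bumble and others are adopting is conceptually simple: when a victim reports an image, the platform stores a compact “fingerprint” of it and checks every future upload against that fingerprint. Below is a minimal sketch of the idea, assuming the open-source Pillow and imagehash Python libraries and an illustrative matching threshold; real deployments use purpose-built perceptual hashes such as Microsoft’s PhotoDNA or Meta’s PDQ and shared hash databases, not this toy code.

```python
# A minimal sketch of hash-based re-upload blocking, not any company's actual
# system. Assumes the open-source Pillow and imagehash libraries; production
# platforms use purpose-built perceptual hashes such as PhotoDNA or PDQ.
from PIL import Image
import imagehash

# Perceptual hashes of images already reported and confirmed as IBSA
# (illustrative in-memory list; a real service would use a shared database).
blocked_hashes = []

def register_reported_image(path: str) -> None:
    """Store the perceptual hash of a reported image."""
    blocked_hashes.append(imagehash.phash(Image.open(path)))

def upload_is_blocked(path: str, max_distance: int = 8) -> bool:
    """Check a new upload against the list of reported images.

    Perceptual hashes of visually similar images differ by only a few bits,
    so a small Hamming distance still counts as a match even after resizing,
    recompression, or minor edits.
    """
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in blocked_hashes)
```

In practice, a match would block the upload or route it to human review, and the same hash list can be shared across platforms so a survivor only has to report an image once.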

We are only at the beginning of a dystopian spike in image-based sexual abuse, thanks to newly emerged technologies and to online platforms, including social media companies, that host sexually explicit material with no meaningful age or consent verification.

Companies like Pornhub, X, Snapchat, and others functionally allow anyone to share explicit materials of anyone. They put the burden on survivors to learn about and report their own exploitation.  

Online platforms can and must do more. Mainstream social media sites that allow pornography, like X, formerly Twitter, should stop hosting explicit material altogether: they have neither the infrastructure nor the profit incentive to enact meaningful age and consent verification for the people who appear in those photos and videos.

Online pornography sites like Pornhub and XVideos should also require meaningful identity, age, and consent verification of each person depicted in explicit content, and be held liable if those methods fail. Further, all online platforms must hash every image of IBSA to prevent it from being re-uploaded.

Enough people have been victimized for the lesson to be clear: If a platform can’t verify the age and consent of everyone depicted in explicit content, it shouldn’t host that content.

Laws must also be passed to pressure companies into taking this issue seriously. Congress previously introduced the Protect Act, which would require a degree of consent verification for sexually explicit material; it has yet to be reintroduced.

Image-based sexual abuse is not a problem for others; it is our problem. And now is a vital moment to put the responsibility on technology companies to prevent the creation and distribution of sexually exploitative material. Protecting human dignity deserves no less.

