A Horrifying Glimpse into the Lives of a Deepfake Nude Generator’s Victims

By Caroline Haskins

As AI-powered image generators have become more accessible, so have websites that digitally remove the clothes of people in photos. One of these sites has an unsettling feature that provides a glimpse of how these apps are used: two feeds of what appear to be photos uploaded by users who want to “nudify” the subjects.

The feeds of images are a shocking display of intended victims. Some images clearly depict girls who are minors. Other photos show adults, with captions suggesting they are friends or strangers of the user. The site does not display any fake nude images it may have generated to visitors who are not logged in.

Users seeking to generate and save deepfake nude images are required to log in to the site using a cryptocurrency wallet. No pricing is currently listed, but according to a 2022 video posted by an affiliated YouTube channel, the website allowed users to buy credits to generate deepfake nude images, starting at 5 credits for $5. The site was discovered through a post on a subreddit about the NFT marketplace OpenSea, which linked to the YouTube page. After being contacted by WIRED, YouTube said it had shut down the channel, and Reddit told WIRED the user had been suspended.

WIRED is not identifying the website, which is still online, to protect the women and girls who remain on its feeds. The site’s IP address, which went live in February 2022, belongs to internet security and infrastructure provider Cloudflare. When asked about its involvement, company spokesperson Jackie Dutton noted the difference between providing a site’s IP address, as Cloudflare does, and hosting its contents, which it does not.

WIRED notified the National Center for Missing & Exploited Children, which helps report cases of child exploitation to law enforcement, about the site’s existence.

AI developers like OpenAI and Stability AI say their image generators are for commercial and artistic uses and have guardrails to prevent harmful content. But open source AI image-making technology is now relatively powerful and creating pornography is one of the most popular use cases. As image generation has become more readily available, the problem of nonconsensual nude deepfake images, most often targeting women, has grown more widespread and severe. Earlier this month, WIRED reported that two Florida teenagers were arrested for allegedly creating and sharing AI-generated nude images of their middle school classmates without consent, in what appears to be the first case of its kind.

Mary Anne Franks, a professor at the George Washington University School of Law who has studied the problem of nonconsensual explicit imagery, says that the deepnude website highlights a grim reality: There are far more incidents involving AI-generated nude images of women without consent and minors than the public currently knows about. The few public cases were only exposed because the images were shared within a community, and someone heard about it and raised the alarm.

“There’s gonna be all kinds of sites like this that are impossible to chase down, and most victims have no idea that this has happened to them until someone happens to flag it for them,” Franks says.

The website reviewed by WIRED has feeds with apparently user-submitted photos on two separate pages. One is labeled “Home” and the other “Explore.” Several of the photos clearly showed girls under the age of 18.

One image showed a young girl with a flower in her hair standing against a tree. Another showed a girl in what appears to be a middle or high school classroom. The photo, seemingly taken discreetly by a classmate, is captioned “PORN.”

Another image on the site showed a group of young teens who appear to be in middle school: a boy taking a selfie in what appears to be a school gymnasium with two girls, who smile and pose for the picture. The boy’s features were obscured by a Snapchat lens that enlarged his eyes so much that they covered his face.

Captions on the seemingly uploaded photos indicated that they show acquaintances, classmates, and romantic partners. One caption, on a photo of a young woman taking a picture of herself in a mirror, reads “My girlfriend.”

Numerous photos showed influencers who are popular on TikTok, Instagram, and other social media platforms. Some appeared to be screenshots of Instagram posts in which people share moments from their everyday lives. One image showed a smiling young woman with a dessert topped with a celebratory candle.

Some photos appeared to show people who were complete strangers to the person taking the picture. One, taken from behind, showed a woman or girl who is not posing for the camera, just standing near what appears to be a tourist attraction.

Some images in the feeds reviewed by WIRED were cropped to remove the faces of women and girls, showing only their chest or lower body.

During the eight days WIRED monitored the site, five new images of women appeared on the Home feed and three on the Explore feed. Listings on the site indicate that most of these images have amassed hundreds of “views.” It’s unclear how images are selected for the Home or Explore feed, or how views are calculated. Every post on the Home page has at least a few dozen views.

Photos of celebrities and people with large Instagram followings dominate the site’s “Most Viewed” pictures. The most-viewed people on the site are the actor Jenna Ortega, with more than 66,000 views; the singer-songwriter Taylor Swift, with more than 27,000 views; and a Malaysian influencer and DJ, with more than 26,000 views.

Both Swift and Ortega have previously been targeted with deepfake nudes. The spread of fake nude images of Swift on X in January sparked renewed discussion about the harms of deepfakes and the need for stronger legal protections for victims. This month, NBC reported that, for seven months, Meta had hosted ads for a deepnude app that touted its ability to “undress” people, using a photo of Jenna Ortega taken when she was 16 years old.

In the US, no federal law specifically addresses the distribution of fake, nonconsensual nude images, though several states have enacted their own laws. However, AI-generated nude images of minors fall into the same category as other child sexual abuse material, or CSAM, says Jennifer Newman, executive director of the NCMEC’s Exploited Children’s Division.

“If it is indistinguishable from an image of a live victim, of a real child, then that is child sexual abuse material to us,” Newman says. “And we will treat it as such as we’re processing our reports, as we’re getting these reports out to law enforcement.”

In 2023, Newman says, NCMEC received about 4,700 reports that “somehow connect to generative AI technology.”

People who want to create and save deepfake nude images on the site are asked to log in using either a Coinbase, Metamask, or WalletConnect cryptocurrency wallet. Coinbase spokesperson McKenna Otterstedt said that the company is launching an internal investigation into the site’s integration with the company’s wallet. WalletConnect and ConsenSys-owned Metamask did not respond to requests for comment.

In November 2022, the deepnude site’s YouTube channel posted a video claiming users could “buy credit” with Visa or Mastercard. Neither payment company responded to WIRED’s requests for comment.

In 2022, 30 NFTs featuring unaltered photos of various female Instagram and TikTok influencers were listed on OpenSea, an NFT marketplace. Buying one with the cryptocurrency ether, worth about $280 at current exchange rates, granted the owner access to the website, which at the time was still in development, according to web archives. The NFT listings emphasized users’ privacy.

The NFTs were organized by tags describing the women’s perceived attributes. Tags included Boob Size, Country (predominantly Malaysia or Taiwan), and Traits such as “cute,” “innocent,” and “motherly.”

None of the NFTs listed by the account were sold. OpenSea removed the listings and the account within an hour and a half of WIRED contacting the company. The women featured in the NFTs did not respond to requests for comment.

It’s unclear who created or owns the deepnude website. The removed OpenSea account had a profile picture identical to the third Google Images result for “nerd.” Its bio stated that the creator aims to “reveal the shitty thing in this world” and share it with “all douche and pathetic bros.”

An X account linked from the OpenSea account used the same bio and also linked to a now-inactive blog about “Whitehat, Blackhat Hacking” and “Scamming and Money Making.” The account’s owner appears to have been one of three contributors to the blog, where he went by the moniker 69 Fucker.

The website was promoted on Reddit by just one user, whose profile picture showed a man of East Asian descent who appeared to be under 50. An archive of the website from March 2022, however, claims the site “was created by 9 horny skill-full people.” Most of the team’s profile images appeared to be stock photos, and the listed job titles were facetious, among them Horny Director, Scary Stalker, and Booty Director.

An email address associated with the website did not respond to a request for comment.
