AI is getting better at generating porn

A red-haired woman standing on the moon, her face hidden. Her bare body looks like it belongs on a poster you’d find on a hormonal teenager’s bedroom wall—that is, until you get to her torso, where three arms sprout from her shoulders.

AI-powered systems like Stable Diffusion, which translate text prompts into pictures, have been used by brands and artists to create concept images, award-winning (albeit controversial) prints and full-blown marketing campaigns.

But some users intent on exploring the systems’ darker side are testing them for a different kind of use case: porn.

AI porn is about as disturbing and imperfect as you’d expect (that redhead on the moon probably wasn’t generated by someone with an extra arm fetish). But as the technology continues to improve, it will raise challenging questions for AI ethicists and sex workers alike.

The pornography, created using the latest image-generating systems, first arrived on the scene via the message boards 4chan and Reddit earlier this month, after a 4chan member leaked the open-source Stable Diffusion system ahead of its official release. Then, last week, what appears to be one of the first websites dedicated to generating high-fidelity AI porn launched.

Called Porn Pen, the website allows users to customize the appearance of AI-generated nude models — all of whom are women — using toggleable tags like “babe,” “underwear model” and “chubby,” along with ethnicities (e.g., “Russian” and “Latina”) and backdrops (e.g., “bedroom,” “shower” and wildcards like “moon”). Buttons capture models from the front, back or side, and change the appearance of the generated photo (e.g., “film photo,” “mirror selfie”). There must be a bug in the mirror selfies, though, because in the feed of user-generated images, some mirrors don’t actually reflect a person — but of course, these models aren’t people at all. Porn Pen functions like “This Person Does Not Exist,” only it’s NSFW.

On Y Combinator’s Hacker News forum, a user claiming to be the creator describes Porn Pen as an “experiment” using cutting-edge text-to-image models. “I explicitly removed the ability to specify custom text to avoid harmful imagery from being generated,” they wrote. “New tags will be added once the prompt-engineering algorithm is fine-tuned further.” The creator did not respond to TechCrunch’s request for comment.

But Porn Pen raises a host of ethical questions, like biases in the systems that generate the images and the sources of the data they were trained on. Beyond the technical implications, one wonders whether new tech to create customized porn — assuming it catches on — could hurt adult content creators who make a living doing the same.

“I think it’s somewhat inevitable that this would come about when [OpenAI’s] DALL-E did,” Os Keyes, a PhD candidate at Seattle University, told TechCrunch via email. “But it’s still depressing how both the options and defaults replicate a very heteronormative and male gaze.”

Ashley, a sex worker and peer organizer who works on cases involving content moderation, believes that the content generated by Porn Pen is not a threat to sex workers in its current state.

“There’s endless media,” said Ashley, who did not want her last name published for fear of being harassed for her work. “But people excel not only by making the best media, but also by being approachable, interesting people. It will be a long time before AI can replace that.”

On existing monetized porn sites like OnlyFans and ManyVids, adult creators must verify their age and identity so the company knows they are consenting adults. AI-generated porn models can’t do that, of course, because they’re not real.

However, Ashley worries that if porn sites crack down on AI porn, it could lead to harsher restrictions on sex workers, who already face increased regulation from legislation like SESTA/FOSTA. The Safe Sex Worker Study Act was introduced in Congress in 2019 to study the effects of this legislation, which makes online sex work more difficult. The study found that “community organizations [had] reported increased homelessness of sex workers” after losing the “economic stability provided by access to online platforms.”

“SESTA was sold as a fight against child sex trafficking, but it created a new criminal prostitution law that had nothing to do with age,” Ashley said.

Currently, few laws around the world pertain to deepfaked porn. In the U.S., only Virginia and California have regulations restricting certain uses of faked and deepfaked pornographic media.

Systems like Stable Diffusion “learn” to generate images from text by example. Fed billions of pictures labeled with annotations that indicate their content — for example, a picture of a dog labeled “Dachshund, wide-angle lens” — the systems learn that specific words and phrases refer to specific art styles, aesthetics, locations and so on.
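For a sense of how thin the software layer on top of these models can be, here is a minimal sketch of that text-to-image step, using the open-source Stable Diffusion weights via Hugging Face’s diffusers library. (The model ID, prompt and settings below are illustrative; nothing about Porn Pen’s actual stack has been confirmed.)

```python
# Minimal text-to-image sketch with Stable Diffusion via the
# Hugging Face `diffusers` library. The model ID and settings are
# illustrative assumptions, not Porn Pen's confirmed stack.
import torch
from diffusers import StableDiffusionPipeline

# Download the open-source Stable Diffusion v1.5 weights.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # needs an NVIDIA GPU with a few GB of VRAM

# The prompt-to-image call: the model maps these words onto the
# subjects and styles it associated with them during training.
image = pipe(
    "a wide-angle photo of a dachshund",
    num_inference_steps=50,  # more denoising steps, finer output
    guidance_scale=7.5,      # how closely to follow the prompt
).images[0]
image.save("dachshund.png")
```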

This works reasonably well in practice. A prompt such as “a bird painted in the style of Van Gogh” would predictably result in a Van Gogh-style image depicting a bird. But it gets more complicated when the prompts are more vague, relate to stereotypes, or deal with subject matter the systems are unfamiliar with.

For example, Porn Pen sometimes generates images without a person at all, presumably a failure of the system to understand the prompt. Other times, as mentioned earlier, it shows physically improbable models, typically with extra limbs, nipples in unusual places and contorted flesh.

“By definition [these systems are] going to represent those whose bodies are accepted and valued in mainstream society,” Keyes said, noting that Porn Pen only has categories for cisnormative people. “It’s unsurprising to me that you’d end up with a disproportionate number of women, for example.”

While Stable Diffusion, one of the systems likely underpinning Porn Pen, has relatively little “NSFW” imagery in its training dataset, early experiments by Redditors and 4chan users show that it’s quite adept at generating pornographic deepfakes of celebrities (Porn Pen — perhaps not coincidentally — has a “celebrity” option). And because it’s open source, nothing would stop Porn Pen’s creator from fine-tuning the system on additional nude images.

“Definitely not great to generate [porn] of an existing person,” Ashley said. “It can be used to harass them.”

Deepfake porn is often created to threaten and harass people. These images are almost always developed without the subject’s consent, out of malicious intent. In 2019, the research company Sensity found that 96% of deepfake videos online were non-consensual pornography.

Mike Cook, an artificial intelligence researcher who is part of the Knives and Paintbrushes collective, says there is a possibility that the dataset includes people who have not consented to their image being used for training in this way, including sex workers.

“Many of [the people in the nudes in the training data] may derive their income from the production of pornography or pornography-related content,” Cook said. “Just like fine artists, musicians or journalists, these people’s works are used to create systems that also undermine their ability to make a living in the future.”

In theory, a porn actor could use copyright protections, defamation and potentially even human rights laws to fight the creator of a deepfaked image. But as a piece in MIT Technology Review notes, gathering evidence to support the legal argument can prove to be a formidable challenge.

When more primitive AI tools popularized deepfake porn several years ago, a Wired investigation found that non-consensual deepfake videos were racking up millions of views on mainstream porn sites like Pornhub. Other deepfaked works found a home on sites similar to Porn Pen — according to Sensity data, the top four deepfake porn websites received more than 134 million views in 2018.

“AI imaging is now a widespread and affordable technology, and I don’t think anyone is really prepared for the implications of that ubiquity,” Cook continued. “I think we’ve rushed very, very far into the unknown in the last few years without paying attention to the impact of this technology.”

According to Cook, one of the most popular AI-generated porn sites expanded late last year through partner agreements, referrals and an API, allowing the service — which hosts hundreds of non-consensual deepfakes — to survive bans on its payments infrastructure. And in 2020, researchers discovered a Telegram bot that generated abusive deepfake images of more than 100,000 women, including underage girls.

“I think in the next decade we’re going to see a lot more people testing the limits of both the technology and society’s boundaries,” Cook said. “We need to take some responsibility for this and work to educate people about the consequences of what they’re doing.”
