Adult Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

Luke
Posts: 5701
Joined: Mon Feb 22, 2021 1:21 pm
Location: @orly_licious With Pete Buttigieg and the other "open and defiant homosexuals" --Bryan Fischer AFA

Adult Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

#1

Post by Luke »

The speed of AI's progress is nothing short of magical and potentially dangerous.

This site doesn't even require registration, just type in what you want to see, and bam: https://stablediffusionweb.com/#demo

Examples
A high tech solarpunk utopia in the Amazon rainforest
A pikachu fine dining with a view to the Eiffel Tower
A mecha robot in a favela in expressionist style
an insect robot preparing a delicious meal
A small cabin on top of a snowy mountain in the style of Disney, artstation
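If you'd rather run it locally than use the web demo, a minimal sketch with Hugging Face's diffusers library looks roughly like this (assuming diffusers and PyTorch are installed and a CUDA GPU is available; the checkpoint name is just one commonly used example):

# Minimal local text-to-image sketch using Stable Diffusion through the
# Hugging Face `diffusers` library; the checkpoint id is an assumption.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # any Stable Diffusion checkpoint works here
    torch_dtype=torch.float16,          # half precision; needs a CUDA GPU
)
pipe = pipe.to("cuda")

prompt = "A small cabin on top of a snowy mountain in the style of Disney, artstation"
image = pipe(prompt).images[0]          # returns a PIL image
image.save("cabin.png")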

But realistically, adult content has been an early driver of almost every new media technology, from videotape recorders to the first big commercial uses of the internet. Now AI is heading down the same road. Rolling Stone:
They’re Selling Nudes of Imaginary Women on Reddit — and It’s Working
EJ Dickson Mon, April 10, 2023 at 10:59 AM

“F/19 feeling pretty today,” Claudia’s post reads. She’s got straight black bangs and giant blue-green eyes, with just the socially appropriate amount of cleavage sticking out of her grey tank top. With her alabaster skin, delicate features, and vaguely indie hairstyle, she looks exactly like someone the average Redditor would obsess over, and indeed, the comments on Claudia’s post on the subreddit r/faces are all variations of, “hot” and “you’re very gorgeous.” Except for one.

“For those who aren’t aware, I’m going to kill your fantasy,” the comment reads. “This is literally an AI creation, if you’ve worked with AI image models and making your own long enough, you can 10000% tell. Sorry to ruin the surprise, I guess.” Claudia is, indeed, an AI-generated creation, who has posted her (AI-generated) lewd photos on other subreddits, including r/normalnudes and r/amihot. She’s the brainchild of two computer science students who tell Rolling Stone they essentially made up the account as a joke, after coming across a post on Reddit from a guy who made $500 catfishing users with photos of real women. They made about $100 selling her nudes until other redditors called out the account, though they continue to post lewds on other subreddits.

“You could say this whole account is just a test to see if you can fool people with AI pictures,” says the team behind Claudia, who declined to disclose their real names. “You could compare it to the vtubers, they create their own characters and play as an entirely different person. We honestly didn’t think it would get this much traction.”

Claudia was created by Stable Diffusion, an AI program that uses machine learning to generate shockingly realistic-looking photos using nothing but a text prompt. (In this case, the text prompt was a selfie of a woman in her house “without makeup with black hair, shoulder length hair, simple background, straight hair, hair bangs.”) Her post on r/faces prompted a firestorm of users reporting the post, leading a moderator for the group, who asked not to be named, to post a disclaimer clarifying that AI-generated photos are not against the subreddit’s rules. “I take a caveat emptor approach with these things,” the moderator tells Rolling Stone. “If people think it is real and want to do something outside of my subreddit, that is on them.”

Claudia is among the first, but by no means the last, fictional adult content creator to be generated via rapidly evolving AI technology, prompting a slew of ethical questions and concerns. Most discussions about the dangers posed by AI and adult content have focused on the prevalence of deepfakes, a term used to describe an image or video that uses a person’s face without their consent. According to Sensity, an AI firm, nearly 96 percent of all deepfakes are pornographic in nature and feature a woman’s face being used without their consent. Though many platforms, like Reddit, ostensibly have policies preventing the proliferation of deep fakes, such content is fairly easy to find online, with some Discord communities selling deepfake porn of “personal girls” — meaning non-celebrities — for as little as $5 a pop, according to an NBC News report.
https://www.yahoo.com/entertainment/sel ... 12835.html
Lt Root Beer of the Mighty 699th. Fogbow 💙s titular Mama June in Fogbow's Favourite Show™ Mama June: From Not To Hot! Fogbow's Theme Song™ Edith Massey's "I Got The Evidence!" https://www.youtube.com/watch?v=C5jDHZd0JAg
p0rtia
Posts: 5083
Joined: Mon Feb 22, 2021 9:55 am

Sexy & Other Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

#2

Post by p0rtia »

Could you maybe split this up into two threads, Orly?

I am fascinated by AI-produced images, and am over the moon about the recent posting of congressrats in drag--but I don't want to see sexy AI graphics at all. However you define sexy. I don't want to disenfranchise those who do want to see them, so, a split?


What do you think?

Sexy & Other Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

#3

Post by Luke »

Sure! I've been thinking about how to do that... I'll keep this topic since it talks about the "virtual girlfriend" and "virtual boyfriend" images that are being sold as actual people above, but I'll make another topic that's just about the AI graphics. 💙

Sexy & Other Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

#4

Post by p0rtia »

orlylicious wrote: Wed Apr 12, 2023 2:15 pm Sure! I've been thinking about how to do that... I'll keep this topic since it talks about the "virtual girlfriend" and "virtual boyfriend" images that are being sold as actual people above, but I'll make another topic that's just about the AI graphics. 💙
Thanks! :lovestruck:

Adult Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

#5

Post by Luke »

From VCRs to the widespread adoption of the web to now this, adult material has always been an early driver of new technology. That's why I created this topic. Government can never keep up with technological innovation. This is going to be an enormous problem: AI-generated CP. Extremely disturbing and very, very difficult to stop. Law enforcement, government policy, and voluntary (or, if necessary, mandatory) AI watermarking all need to be addressed quickly. I'd even think this is something the UN should start dealing with.
AI-generated child sex images spawn new nightmare for the web
Investigators say the disturbing images could undermine efforts to find real-world victims
By Drew Harwell June 19, 2023 at 7:00 a.m. EDT

The revolution in artificial intelligence has sparked an explosion of disturbingly lifelike images showing child sexual exploitation, fueling concerns among child-safety investigators that they will undermine efforts to find victims and combat real-world abuse. Generative-AI tools have set off what one analyst called a “predatory arms race” on pedophile forums because they can create within seconds realistic images of children performing sex acts, commonly known as child pornography.

Thousands of AI-generated child-sex images have been found on forums across the dark web, a layer of the internet visible only with special browsers, with some participants sharing detailed guides for how other pedophiles can make their own creations. “Children’s images, including the content of known victims, are being repurposed for this really evil output,” said Rebecca Portnoff, the director of data science at Thorn, a nonprofit child-safety group that has seen month-over-month growth of the images’ prevalence since last fall. “Victim identification is already a needle in a haystack problem, where law enforcement is trying to find a child in harm’s way,” she said. “The ease of using these tools is a significant shift, as well as the realism. It just makes everything more of a challenge.”

The flood of images could confound the central tracking system built to block such material from the web because it is designed only to catch known images of abuse, not detect newly generated ones. It also threatens to overwhelm law enforcement officials who work to identify victimized children and will be forced to spend time determining whether the images are real or fake. The images have also ignited debate on whether they even violate federal child-protection laws because they often depict children who don’t exist. Justice Department officials who combat child exploitation say such images still are illegal even if the child shown is AI-generated, but they could cite no case in which a suspect had been charged for creating one.

The new AI tools, known as diffusion models, allow anyone to create a convincing image solely by typing in a short description of what they want to see. The models, such as DALL-E, Midjourney and Stable Diffusion, were fed billions of images taken from the internet, many of which showed real children and came from photo sites and personal blogs. They then mimic those visual patterns to create their own images. The tools have been celebrated for their visual inventiveness and have been used to win fine-arts competitions, illustrate children’s books and spin up fake news-style photographs, as well as to create synthetic pornography of nonexistent characters who look like adults.

But they also have increased the speed and scale with which pedophiles can create new explicit images because the tools require less technical sophistication than past methods, such as superimposing children’s faces onto adult bodies using “deepfakes,” and can rapidly generate many images from a single command. It’s not always clear from the pedophile forums how the AI-generated images were made. But child-safety experts said many appeared to have relied on open-source tools, such as Stable Diffusion, which can be run in an unrestricted and unpoliced way. Stability AI, which runs Stable Diffusion, said in a statement that it bans the creation of child sex-abuse images, assists law enforcement investigations into “illegal or malicious” uses and has removed explicit material from its training data, reducing the “ability for bad actors to generate obscene content.”

https://www.washingtonpost.com/technolo ... se-images/
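The "central tracking system" the Post refers to works by matching fingerprints (hashes) of already-identified images, which is exactly why brand-new AI output slips past it. A rough illustration of the idea, not any agency's actual system, assuming the open-source imagehash and Pillow packages and made-up file names:

# Toy hash-list matching: only images perceptually close to an already-known
# image get flagged, so freshly generated content produces no match at all.
import imagehash
from PIL import Image

# Perceptual hashes of previously identified images (hypothetical files).
known_hashes = [imagehash.phash(Image.open(p)) for p in ("known_1.png", "known_2.png")]

def is_known(path, max_distance=5):
    """True only if the image is perceptually close to something on the known list."""
    h = imagehash.phash(Image.open(path))
    return any(h - k <= max_distance for k in known_hashes)  # Hamming distance

print(is_known("newly_generated.png"))  # a novel image simply has nothing to match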

I'm in contact with some of the sources in this story. If anyone has ideas about how to mitigate this, I'll forward them along. I've been involved in these issues since the '90s, in a role that included the creation of Section 230 and the beginning of the internet's mass commercialization. Steve Case, founder of AOL, told me something simple that has been true ever since: "It always takes longer than you expect, and then it happens faster than you expect." I know many here have broad and deep experience in IT and internet science and would appreciate your thoughts.
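On the watermarking idea above: one existing open-source building block is the invisible-watermark package (Stable Diffusion's reference code ships with something along these lines). Here's a minimal sketch of embedding and reading back a payload, assuming that package plus OpenCV; the file names and payload are made up, and it's an illustration, not a policy proposal:

# Sketch of invisibly watermarking a generated image with the
# `invisible-watermark` package (module name `imwatermark`) and OpenCV.
import cv2
from imwatermark import WatermarkEncoder, WatermarkDecoder

bgr = cv2.imread("generated.png")            # hypothetical AI-generated image

encoder = WatermarkEncoder()
encoder.set_watermark("bytes", b"AIGEN")     # payload marking the image as synthetic
marked = encoder.encode(bgr, "dwtDct")       # embed in frequency-domain coefficients
cv2.imwrite("generated_marked.png", marked)

decoder = WatermarkDecoder("bytes", 5 * 8)   # expected payload length in bits
payload = decoder.decode(cv2.imread("generated_marked.png"), "dwtDct")
print(payload)                               # b"AIGEN" if the mark survived

The obvious limitation is that anyone running an open-source model locally can simply skip this step, which is why the voluntary-versus-mandatory question matters.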

Tweeted it to the UN and UN Human Rights; the UN Human Rights Council's 53rd session starts today.



Foggy
Dick Tater
Posts: 9651
Joined: Mon Feb 22, 2021 8:45 am
Location: Fogbow HQ
Occupation: Dick Tater/Space Cadet
Verified: as seen on qvc zombie apocalypse

Adult Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

#6

Post by Foggy »

It's a brave new world ... and AI tech is still in its infancy ... :bored:
The more I learn about this planet, the more improbable it all seems. :confuzzled:
bill_g
Posts: 5559
Joined: Mon Feb 22, 2021 5:52 pm
Location: Portland OR
Occupation: Retired (kind of)
Verified: ✅ Checked Republic ✓ ᵛᵉʳᶦᶠᶦᵉᵈ

Adult Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

#7

Post by bill_g »

Deep porn. Wonderful.

Next ...

Adult Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

#8

Post by Foggy »

Next, of course, is ultra-violent porn.

A Clockwork Orange, come to reality.

And in all honesty, maybe a real Earthling will be spared some horror because it can be done virtually instead. :shrug:

Adult Graphics & Photos Using Artificial Intelligence AI - Stable Diffusion, DALL-E

#9

Post by bill_g »

Apparently you can, or at least could, get real-life streaming monkey torture. There are some sick people out there.

https://www.bbc.com/news/world-65951188
