“The school shooter played Doom on his IBM Personal Computer” vibe all over again.
Is it abuse if there is no one being abused?
The initial training pictures may come from abused children. And we can theorize that they will need more to keep training the AI.
You’re thinking OpenAI (EDIT: Runway, sorry, got mixed up with ChatGPT) put CSEM in their data set? Or maybe it was an accident? I may be mistaken, but it looks more like someone used a pretrained image generator and retrained it on child pics. The following site, for example, teaches you how to build and train an image generator; you’d just have to change the kind of pictures it’s trained on to make it malevolent. https://www.assemblyai.com/blog/minimagen-build-your-own-imagen-text-to-image-model/
Installed this. Made no changes to it. Ran a few queries. Most of it was nightmare fuel of severed limbs and crazy teeth etc., as I have no clue what I’m doing. But still, with enough tries… it generated it. So, confirmed that you don’t need to introduce CP to get naked AI kids.
It’s possible; you can easily train it with new images. Is it also possible for it to extrapolate adult body parts down to a smaller scale, or does it not work that way? I don’t know very much about this stuff at all.
A couple of places I know of with SD models, etc.: https://huggingface.co and https://civitai.com/
You’re right, it does work that way - it’s why ‘a photo of an astronaut riding a horse’ is the standard demo for SD: it shows that the model can create things it wasn’t trained on by remixing and extrapolating elements. Even without that, though, it can do things like turn a cartoon image into a realistic one (or vice versa) with img2img, without necessarily needing to understand what the content is at all.
Also, it’s possible to recursively train models - create a rough model, use its output as training data for a more refined model, rinse and repeat. I’ve found it works well for getting a strong and consistent face LoRA, but I imagine the same method could be used to create any sort of model without using real photos.
Great tips, thanks!
Thanks, learnt a thing.
As described in the article, there is a zero-tolerance policy because of the danger of escalating from generated material to real material. Children are not sexual objects. Whoever feels differently needs help.
What’s even legal in britbong land? I think they tried to outlaw anything as kinky as or kinkier than “girl on top” for a hot minute before there was a big backlash. I have no idea what their “obscenity” laws are now.
I’ve seen the demons: they’re over on Pawoo constantly sharing their fucking CP. I guess it’s easier to go after artists than to catch criminals harming actual children. Had more to say, but it turned into a rant.
Couldn’t AI porn be something that makes actual CP go extinct? I have the feeling that this is not at all about protecting children but simply about hating on the “perverts”.
The issue in this instance is that the model was trained on actual images. In much the same way that training on copyrighted material without permission is illegal, simply training on such images is itself illegal.