“The school shooter played Doom on his IBM Personal Computer” vibe all over again.

    • Diurnambule@lemmy.fmhy.ml · 4 points · 1 year ago

      The initial training pictures may come from abused children. And we can theorize that they will need more to keep training the AI.

      • Pink Bow@burggit.moe · 2 points · 1 year ago (edited)

        You’re thinking OpenAI (EDIT: Runway, sorry, got mixed up with ChatGPT) put CSEM in their data set? Or maybe it was an accident?

          • Pink Bow@burggit.moe · 6 points · 1 year ago

            Installed this. Made no changes to it. Ran a few queries. Most of it was nightmare fuel of severed limbs and crazy teeth, etc., as I have no clue what I’m doing. But still, with enough tries… it generated it. So, confirmed: you don’t need to introduce CP to get naked AI kids.

          • Pink Bow@burggit.moe · 6 points · 1 year ago (edited)

            It’s possible; you can easily train it with new images. Is it also possible that it can extrapolate adult parts smaller, or does it not work that way? I don’t know very much about this stuff at all.

            A couple of places I know with SD models, etc.: https://huggingface.co and https://civitai.com/

            • CyanParsnips@burggit.moe · 5 points · 1 year ago

              You’re right, it does work that way - it’s why ‘a photo of an astronaut riding a horse’ is the standard demo for SD, to show that it can create things it wasn’t trained on by remixing and extrapolating elements. Even without that, though, it can do things like turn a cartoon image into a realistic one (or vice versa) with img2img, without necessarily needing to know what the content is at all.

              Also, it’s possible to recursively train models - create a rough model, use its output as training data for a more refined model, rinse and repeat. I’ve found it works well for getting a strong and consistent face LoRA, but I imagine the same method could be used to create any sort of model without using real photos.

    • Haui@discuss.tchncs.de · 2 points · 1 year ago

      As described in the article, there is a zero-tolerance policy because of the danger of switching from generated to real. Children are not sexual objects. Whoever feels differently needs help.

  • mcuglys@burggit.moe · 14 points · 1 year ago

    What’s even legal in britbong land? I think they tried to outlaw anything as kinky as or kinkier than “girl on top” for a hot minute before there was a big backlash. I have no idea what their “obscenity” laws are now.

  • Pink Bow@burggit.moe · 11 points · 1 year ago

    I sawed the daemons: they’re over on Pawoo constantly sharing their fucking CP. I guess it’s easier to go after artists than catch criminals harming actual children. Had more to say, but it turned into a rant.

  • Obonga@burggit.moe · 6 points · 1 year ago

    Couldn’t AI porn be something that makes actual CP go extinct? I have the feeling that it is not at all about protecting children but simply about hating on the “perverts”.

    • KilotonPress@burggit.moe · 3 points · 1 year ago

      The issue in this instance is that it was trained on actual images. In much the same way that training on copyrighted material without permission is illegal, simply having it train on such images is illegal.