• FaceDeer@fedia.io
    4 months ago

    You obviously don’t understand squat about AI.

    Ha.

    AI only knows what has gone through its training data, both from the developers and the end users.

    Yes, and as I’ve said repeatedly, it’s able to synthesize novel images from the things it has learned.

    If you train an AI with pictures of green cars and pictures of red apples, it’ll be able to figure out how to generate images of red cars and green apples for you.
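
    To make that claim concrete, here is a minimal sketch of the kind of compositional prompting being described, assuming the Hugging Face diffusers library and a public Stable Diffusion checkpoint (the model name and prompt are illustrative choices, not anything named in this thread):

    ```python
    # Sketch: ask a pretrained text-to-image diffusion model for a combination of
    # concepts. The point is that the model composes learned concepts ("car",
    # "apple", colour attributes) at sampling time rather than retrieving a
    # memorized training image.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    prompt = "a bright red car parked next to a pile of green apples"
    image = pipe(prompt, num_inference_steps=30).images[0]
    image.save("red_car_green_apples.png")
    ```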

    • over_clox@lemmy.world
      4 months ago

      Exactly. And if you ask it for the opposite of an older MILF, then how does it know what younger ladies look like?

      • ricecake@sh.itjust.works
        4 months ago

        Is an image of a child inappropriate? Fully clothed, nothing going on.

        Is the image of an adult engaging in sexual activity inappropriate?

        Based on those two concepts, it can generate inappropriate child sexual imagery.

        You may have done OCR work a while ago, but that is not the same type of machine learning that goes into typical generative AI systems in the modern world. It very much seems as though you are profoundly misunderstanding how this technology operates if you think it can’t generate a novel combination of previously trained concepts without a prior example.

        • over_clox@lemmy.world
          4 months ago

          I’m referring to the inappropriate photography and videos out there. Please learn to read.

      • FaceDeer@fedia.io
        4 months ago

        It’s possible to legally photograph young people. Completely ordinary legal photographs of young people exist, from which an AI can learn the concept of what a young person looks like.

        • over_clox@lemmy.world
          4 months ago

          The only example I can think of that fits what you said is a couple of brief, innocent scenes from The Blue Lagoon.

          Short of that, I don’t know of (nor care for any references to) any other legal public images or videos of anything like that.

          I dunno, I’m just bumfuzzled as to how AI, whether public or private, could have sufficient information to generate such things these days.

          • FaceDeer@fedia.io
            4 months ago

            Do a Google Image search for “child” or “teenager” or other such innocent terms and you’ll find plenty of such images.

            I think you’re underestimating just how well AI is able to learn basic concepts from images. A lot of people imagine these AIs as being some sort of collage machine that pastes together little chunks of existing images, but that’s not what’s going on under the hood of modern generative art AIs. They learn the underlying concepts and characteristics of what things are, and are able to remix them conceptually.
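
            A rough way to see that kind of concept learning is to look at text embeddings from a joint vision-language model. The sketch below uses OpenAI’s CLIP via the transformers library (my choice of model; nothing in the thread names one) and checks whether “green car” minus “green apple” plus “red apple” lands near “red car” in embedding space, which it often does:

            ```python
            # Sketch: concept arithmetic in CLIP text-embedding space, illustrating
            # that colour attributes and object categories are learned as roughly
            # separable concepts rather than memorized pixel patterns.
            import torch
            from transformers import CLIPModel, CLIPProcessor

            model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
            processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

            phrases = ["a green car", "a green apple", "a red apple", "a red car"]
            inputs = processor(text=phrases, return_tensors="pt", padding=True)
            with torch.no_grad():
                emb = model.get_text_features(**inputs)
            emb = emb / emb.norm(dim=-1, keepdim=True)  # unit-normalize each embedding

            green_car, green_apple, red_apple, red_car = emb
            composed = green_car - green_apple + red_apple  # swap the colour attribute
            composed = composed / composed.norm()

            print("cosine(composed, 'a red car')   =", torch.dot(composed, red_car).item())
            print("cosine(composed, 'a green car') =", torch.dot(composed, green_car).item())
            ```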

            • over_clox@lemmy.world
              4 months ago

              And conceptually, if I had never seen my cousin in the nude, I’d never know what young people look like naked.

              No, that’s not a concept, that’s a fact. AI has seen inappropriate things, and it doesn’t fully know the difference.

              You can’t blame the AI itself, but you can and should blame any and all users that have knowingly fed it bad data.