• xmunk@sh.itjust.works · 3 months ago

    Cool, how would it know what a naked young person looks like? Naked adults look significantly different.

      • xmunk@sh.itjust.works · 3 months ago

        Is a kid just a 60% reduction by volume of an adult? And these are generative algorithms… nobody really understands how they perceive the world and relate words.

        • FaceDeer@fedia.io · 3 months ago

          It understands young and old. That means it knows a kid is not just a 60% reduction by volume of an adult.

          We know it understands these sorts of things because of the very thing this whole kerfuffle is about: it can generate images of things that weren’t explicitly in its training set.

          • xmunk@sh.itjust.works · 3 months ago

            But it doesn’t fully understand “young”, and a “naked young person” isn’t just a scaled-down “naked adult”. There are physiological changes people go through during puberty, which is why “it understands young vs. old” on its own is a vapid, low-effort argument. Your comment has more substance behind it, so I’d clarify that merely having a vague understanding of young and old doesn’t mean it can generate CSAM.

        • GBU_28@lemm.ee · 3 months ago

          Just go ask a model to show you, with legal subject matter.