Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

  • Mango@lemmy.world · 10 months ago

    Nightmare? Doesn’t it simply give them the chance to just say any naked pic of them is fake now?

    • TwilightVulpine@lemmy.world · 10 months ago

      Oh I’m sure that must be a very nice thing to talk out with your mother or significant other.

      “Don’t worry they are plastering naked pictures of me everywhere, it’s all fake”

        • TwilightVulpine@lemmy.world · 10 months ago

          Yeah, but it’s still humiliating for everyone involved, never mind the additional harassment that might bring.

          Frankly, folks saying “everyone will just start assuming all porn is fake and nobody will mind it anymore” are just deluding themselves. I dunno if they want it to turn out like this so they won’t have to worry about the ethics of the matter, but that’s not how people behave.

      • Mango@lemmy.world · 10 months ago

        Have you met my mom? She would 100% love the fact that people are trying to see her naked.

                • TwilightVulpine@lemmy.world · 10 months ago

                  Today? I wouldn’t say that so confidently (even on Lemmy). People working in media and marketing do have to use it. And even if someone isn’t on social media, that doesn’t mean they are immune to such a thing happening, or that people they know won’t stumble on it.

                  • Mango@lemmy.world · 10 months ago

                    These people wouldn’t have personal contacts there. It’s beside the point.

    • Siegfried@lemmy.world · 10 months ago

      An Argentine candidate got filmed while allegedly stoned, so “someone” released an even worse video, clearly fabricated with AI, to discredit the first one. It kind of worked.