Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

  • books@lemmy.world · +61/−1 · 7 months ago

    I feel like I live on the Internet and I never see this shit. Either it doesn’t exist or I exist on a completely different plane of the net.

    • GBU_28@lemm.ee · +11/−1 · 7 months ago

      You ever somehow get invited to a party you’d usually never be at? With a crowd you never, ever see? This is that.

    • schnurrito@discuss.tchncs.de · +7 · 7 months ago

      On the Internet, censorship happens not by having too little information, but too much information in which it is difficult to find what you want.

      We all have only so much time to spend on the Internet and so necessarily get a filtered experience of everything that happens on the Internet.

  • BeefPiano@lemmy.world · +65/−7 · 7 months ago

    I wonder if this winds up with revenge porn no longer being a thing? Like, if someone leaks nudes of me I can just say it’s a deepfake?

    Probably a lot of pain for women from mouth breathers before we get there from here.

    • Snot Flickerman@lemmy.blahaj.zone · +47/−1 · 7 months ago

      I mean, not much happened to protect women after The Fappening, and that happened to boatloads of famous women with lots of money, too.

      Arguably, not any billionaires, so we’ll see I guess.

    • thantik@lemmy.world · +18/−1 · 7 months ago

      This has already been a thing in courts with people saying that audio of them was generated using AI. It’s here to stay, and almost nothing is going to be ‘real’ anymore unless you’ve seen it directly first-hand.

        • 800XL@lemmy.world · +3 · 7 months ago

          If shitty non-real person AI-generated image deformity porn with body parts like this image isn’t real, I bet it will be. There, you’re all welcome.

      • sunbeam60@lemmy.one · +2 · 7 months ago

        Thereby furthering erosion of our democracies and continuing the slide into Putin-confusion.

    • JoBo@feddit.uk · +8/−1 · 7 months ago

      Why do you think “there” is meaningfully different from “here”?

    • TwilightVulpine@lemmy.world · +6 · 7 months ago

      Why would it make revenge porn less of a thing? Why are so many people here convinced that as long as people say it’s “fake” it’s not going to negatively affect them?

      The mouth breathers will never go away. They might even use the excuse the other way around: since anyone could claim just about anything is fake, then it might be real and the victim might be lying. Remember that blurry pictures of Bigfoot were enough to fool a lot of people.

      Hell, even if others believe it is fake, wouldn’t it still be humiliating?

      • Æther@lemmy.world · +1 · 7 months ago

        I think you’re underestimating the potential effects of an entire society starting to distrust pictures/video. Yeah a blurry Bigfoot fooled an entire generation, but nowadays most people you talk to will say it’s doctored. Scale that up to a point where literally anyone can make completely realistic pics/vids of anything in their imagination, and have it be indistinguishable from real life? I think there’s a pretty good chance that “nope, that’s a fake picture of me” will be a believable, no question response to just about anything. It’s a problem

  • Stanwich@lemmy.world · +59/−30 · 7 months ago

    WHAT?? DISGUSTING! WHERE WOULD THESE JERKS PUT THIS ? WHAT SPECIFIC WEBSITE DO I NEED TO BOYCOTT?

    • paddirn@lemmy.world · +20/−3 · 7 months ago

      A Google search didn’t really turn up much, far less than if you were to search for something like ‘Nancy Pelosi nude’. It kind of seems overblown, and the only reason it’s gotten any news is because of who it happened to. Nowadays, just being famous seems to mean photoshopped or deepfake porn of you will spread all over the internet.

  • ObsidianZed@lemmy.world · +34/−9 · 7 months ago

    People have been doing these for years even before AGI.

    Now, it’s just faster.

    Edit: Sorry, I suppose I should mean LLM AI

      • Joe@discuss.tchncs.de · +14/−2 · 7 months ago

        There is no point waiting for a response…the threat has been neutralized. Now repeat after me: There is no AGI.

      • Thomrade@lemmy.world · +1 · 7 months ago

        I think they might have used AGI to mean “AI Generated Images,” which I’ve seen used in a few places, not knowing that AGI is already a term in the AI lexicon.

  • FenrirIII@lemmy.world · +27/−2 · 7 months ago

    Went to Bing and found “Taylor swift ai pictures” as a top search. LOTS of images of her being railed by Sesame Street characters

    • Test_Tickles@lemmynsfw.com · +26/−3 · 7 months ago

      I had to add “Muppet” to the search term, and even then I only got one censored image, so I don’t know what your search history is like, but the bing algorithm definitely has some thoughts about what you like.

    • Snot Flickerman@lemmy.blahaj.zone · +20/−2 · 7 months ago

      I’m gonna be real…

      When I stumbled upon (didn’t go looking to be clear) her and Oscar the Grouch on a pile of trash… It sent me.

      I know the whole situation is gross but I couldn’t stop laughing due to the sheer absurdity. Like… What???

  • nyan@lemmy.cafe · +28/−5 · 7 months ago

    Fake celebrity porn has existed since before photography, in the form of drawings and impersonators. At this point, if you’re even somewhat young and good-looking (and sometimes even if you’re not), the fake porn should be expected as part of the price you pay for fame. It isn’t as though the sort of person who gets off on this cares whether the pictures are real or not—they just need them to be close enough that they can fool themselves.

    Is it right? No, but it’s the way the world is, because humans suck.

    • BreakDecks@lemmy.ml · +15 · 7 months ago

      Honestly, the way I look at it is that the real offense is publishing.

      While still creepy, it would be hard to condemn someone for making fakes for personal consumption. Making an AI fake is the high-tech equivalent of gluing a cutout of your crush’s face onto a playboy centerfold. It’s hard to want to prohibit people from pretending.

      But posting those fakes online is the high-tech, scaled-up version of xeroxing the playboy centerfold with your crush’s face on it, and taping up copies all over town for everyone to see.

      Obviously, there’s a clear line people should not cross, but it’s clear that without laws to deter it, AI fakes are just going to circulate freely.

      • EatATaco@lemm.ee · +3/−1 · 7 months ago

        AI fake is the high-tech equivalent of gluing a cutout of your crush’s face onto a playboy centerfold.

        At first I read that as “cousin’s face” and I was like “bru, that’s oddly specific.” Lol

  • DarkMessiah@lemmy.world · +20/−1 · 7 months ago

    And this is why I don’t want to be famous. Being famous exposes your name to the crazies of the world, and leaves you blissfully unaware until the crazies snap.

    • No_Eponym@lemmy.ca · +11 · 7 months ago

      “… privacy is something you can sell, but you can’t buy it back.” -Bob Dylan

  • Mango@lemmy.world · +22/−7 · 7 months ago

    Nightmare? Doesn’t it simply give them the chance to just say any naked pic of them is fake now?

    • TwilightVulpine@lemmy.world · +12/−3 · 7 months ago

      Oh I’m sure that must be a very nice thing to talk out with your mother or significant other.

      “Don’t worry they are plastering naked pictures of me everywhere, it’s all fake”

        • TwilightVulpine@lemmy.world · +2 · 7 months ago

          Yeah, but it’s still humiliating for everyone involved, never mind the additional harassment that might bring.

          Frankly, folks saying “everyone will just start assuming all porn is fake and nobody will mind it anymore” are just deluding themselves. I dunno if they want it to turn out like this so they won’t have to worry about the ethics of the matter, but that’s not how people behave.

      • Mango@lemmy.world · +6/−4 · 7 months ago

        Have you met my mom? She would 100% love the fact that people are trying to see her naked.

                • TwilightVulpine@lemmy.world · +1 · 7 months ago

                  Today? I wouldn’t say that so confidently (on Lemmy even). People working in media and marketing do have to use it. Even if they aren’t in it, it doesn’t mean they are immune to such a thing happening, or that people they know won’t stumble on it.

    • Siegfried@lemmy.world · +3 · 7 months ago

      An Argentine candidate was filmed while allegedly stoned, so “someone” released an even worse video, clearly fabricated by an AI, to disprove the first one. It kind of worked.

  • bean@lemmy.world · +12/−2 · 7 months ago

    Well, targeting someone famous and going overboard with it likely results in legal responses. Perhaps this gets deepfakes the attention they need to be regulated or legally punishable, especially when they target underage children.

  • Dariusmiles2123@sh.itjust.works · +13/−4 · 7 months ago

    At least now, if pictures are real, you can say it’s AI generated.

    Still, to be honest, I’ve never understood how some people can let one-night stands film them naked.

    If it’s a longtime girlfriend or boyfriend who betrays you, that’s different, but people aren’t acting in a clever way when it comes to sex.

    • interdimensionalmeme@lemmy.ml · +9/−1 · 7 months ago

      There’s nothing wrong with recording your naked body and it being seen online by willing persons.

      The people who would disrespect you for it, they’re the problem.

      • Dariusmiles2123@sh.itjust.works · +6/−1 · 7 months ago

        That’s not what I’m talking about.

        I’m talking about not being careful who you’re giving these images if you don’t want them to spread online. And, of course, the person sharing it on the web is the guilty person, not the naked victim.

        • TwilightVulpine@lemmy.world · +3 · 7 months ago

          Well, this very situation shows one can be as careful as they could and they might still have porn of themselves spread everywhere.

    • stown@sedd.it · +2 · 7 months ago

      Well, you don’t have that many brain cells in the areas people are doing that thinking.

  • EatATaco@lemm.ee · +7/−2 · 7 months ago

    God what a garbage article:

    On X—which used to be called Twitter before it was bought by billionaire edgelord Elon Musk

    I mean, really? The guy makes my skin crawl, but what a hypocritically edgy comment to put into an article.

    And then there’s zero comment from Taylor Swift in it at all. The author is basically just speaking for her. Not only that, but she anoints herself spokesperson for all women…while also pretty conspicuously ignoring that men can be victims of this too.

    Don’t get me wrong, I’m not defending non consensual ai porn in the least, and I assume the author and I are mostly in agreement about the need for something to be done about it.

    But it’s trashy, politically charged, and biased articles like this that make people take sides on things like this. IMO, the author is contributing to the problems of society she probably wants to fix.

    • jivandabeast@lemmy.browntown.dev · +0 · 7 months ago

      hypocritically edgy comment to put into an article.

      It’s Vice; their whole brand is edgy. Calling Elon an edgelord is very on-brand for them.

      pretty conspicuously ignoring that men can be victims of this too.

      Sure, but women are disproportionately affected by this. You’re making the “all lives matter” argument of AI porn

      make people take sides on things like this

      People should be taking sides on this.

      Just seems like you wanna get mad for no reason? I read the article, and it doesn’t come across nearly as bad as you would lead anyone to believe. This article is about deepfake pornography as a whole, and how it can (and more importantly HAS) affected women, including minors. Sure, it would have been nice to have a comment from Taylor, but I really don’t think it was necessary.

      • EatATaco@lemm.ee · +0/−1 · 7 months ago

        Its vice, their whole brand is edgy. Calling Elon an edgelord is very on brand for them.

        I’ve come across this source before and don’t recall being so turned off by the tone. If this is on brand for them, then my criticism is not limited to the author.

        Sure, but women are disproportionately affected by this. You’re making the “all lives matter” argument of AI porn

        You have a point, but I disagree. Black lives matter is effectively saying that black lives currently don’t matter (mainly when it comes to policing). All lives matter is being dismissive of that claim because no one really believes that white lives don’t matter to police. Pointing to the fact that there are male victims too is not dismissive of the fact that women are the primary victims of this. It’s almost the opposite and ignoring males is being dismissive of victims.

        People should be taking sides on this.

        Sorry, wasn’t clear on that point. What I was saying here is this will make people take sides based on their politics rather than on the merits of whether it’s wrong in and of itself.

        i really don’t think it was necessary.

        Neither was her speaking for Swift, nor for all of womankind, nor making it only about women, nor calling Musk an edgelord. You seem to be making the same argument as me.

    • Powerpoint@lemmy.ca · +0 · 7 months ago

      I disagree. To pretend nothing is wrong is worse. The author was accurate in their description here.

      • EatATaco@lemm.ee · +0/−1 · 7 months ago

        This is the second poster here who can’t seem to understand that there is a whole world of things between “pretending nothing is wrong” and acting like a child by calling people “edge lord.”

        Last time I checked, on my front page there was an article from the NY Times about how X is spreading misinformation and Musk seems to be part of it. Yet they managed to point out this problem without using the term “edgelord.” Is this shocking to you?

    • Psythik@lemmy.world · +0 · 7 months ago

      On the contrary, I find it more ridiculous when news media pretends like nothing is wrong over at Twitter HQ. I wish more journalists would call Musk out like this every time they’re forced to mention Twitter.

      • EatATaco@lemm.ee · +0/−1 · 7 months ago

        Can you really see nothing other than “pretending nothing is wrong” and “calling Musk an edgelord”?

        I see the media calling out the faults regularly without needing to act like… well, an edgelord.

        • Psythik@lemmy.world · +0 · 7 months ago

          Professionalism was thrown out the window the moment orange man became president. The Republicans play dirty, so everyone else has to as well, or else they’ll walk all over us. Taking the high ground is a dead concept.

          • EatATaco@lemm.ee · +0/−1 · 7 months ago

            I strongly disagree, but this is completely unrelated to what I said.

  • Supermariofan67@programming.dev · +7/−3 · 7 months ago

    I’m not really sure what to think of this. On one hand, the way I see it, AI deep fakes are essentially a form of defamation, and can harm people by in a way being a false rumour about their sexual life. However, public figures are subject to a much higher standard for defamation, and for a very good reason, else there would be a strong chilling effect on satire, parody, and criticism.

    In general I think that deepfakes are only wrong (defamatory) if a reasonable person couldn’t easily distinguish them from reality, so obvious fake stuff doesn’t count. But for those that are, where is the line drawn for public figures? It is unfortunate that many people can’t choose whether to become a public figure, but it is essential to a functional society that freedom of the press and free expression be lenient when it comes to satirical, critical, creative, and even indecent works related to them. But this is of course not absolute.

    • Imgonnatrythis@sh.itjust.works · +12/−2 · 7 months ago

      According to the article it’s impossible, you will probably see these on a billboard on your way into work. According to real life if you spend several hours looking for them you might find them on some 4chan sub somewhere. If you don’t know to avoid corners of the internet like that already, there isn’t any hope.

    • devfuuu@lemmy.world · +1/−2 · 7 months ago

      Don’t be a woman. That’s the only reasonable way to exist. The world is against them and will do anything it can to keep them as toys.

  • Nora@lemmy.ml · +36/−34 · 7 months ago

    I feel like this is amazing. Everyone can get any porn they want and no one gets hurt? And if nudes of anyone are ever “leaked” you could just say they’re AI generated. It’s like a win win win.