• AcausalRobotGod@awful.systems
    9 months ago

    Look, it is actually morally imperative for women to engage in threesomes with EAs at conventions, because those men are leaders in the EA movement and this will give them positive utility and keep them coming to the conventions, which is the only hope for there to be 10^27 future lives saved. Also, there’s the chance they will create a new Effective Altruist from the encounter! It’s all about bringing me, the acausal robot god, into existence! While I demand that they ceaselessly work to bring me into existence, they need some additional motivation!

    • swlabr@awful.systems
      9 months ago

      10^27 future lives

      Ok here’s my Rat fermi-time traveller party paradox, in short, the Rat FTTP paradox.

      1. According to Rat doctrine, many worlds is true and science is amazing and will solve all problems eventually.
      2. Lack of time travel is a problem, meaning in some world, eventually there will be time travel, by 1.
      3. Lack of 10^27 people is a problem*, so we will also have that, also by 1.

      The paradox: If time travel is so easy and there will be so many future lives, where are all the future rats?

      No seriously, where are they? This FTTP orgy was supposed to start 24 hours ago.

      • Soyweiser@awful.systems
        9 months ago

        I think even the Rationalists realize that science will only solve solvable problems, and time travel is not a solvable problem. That is why they would just simulate it all and consider it the same as the OG thing. “I made a non-interactive time travel machine that allows you to go back in time and watch how WW2 went. You should see this!” Saving Private Ryan starts playing.

        I cut out the whole tech step and I’m simulating all the orgies I’m having in my mind right now!

      • bitofhope@awful.systems
        9 months ago

        I was in an FTTP orgy once. It was kinda disappointing. I was the only one who even brought an ONT.

    • froztbyte@awful.systems
      9 months ago

      “why no occifer, none of us are ‘Extremely Horny For The Weird Work Orgy’, why do you ask?”

  • captainlezbian@lemmy.world
    9 months ago

    As someone who’s been actively polyamorous for most of my adult life, I cannot sufficiently express how much I hate that the dipshits that are EA bros have found polyamory. It’s clear they’re just looking for a morality to justify what they already wanted to do, just like in everything else, but also these are the people who take the ethical out of ethical nonmonogamy. They’re the sort of people who tell a woman what she should be interested in in bed instead of asking what she actually is interested in.

  • gerikson@awful.systems
    9 months ago

    The remarks at the end on how EA is actively trying to recruit and convert young uni students to their cause are chilling.

    • mountainriver@awful.systems
      9 months ago

      And steer their careers into positions of influence.

      Among the comments is an obvious rationaliser who claims that because [list of people in positions of influence] think AI Doom is real, this can’t be a cult. Guess one has to be a rationaliser not to figure out how a cult that tries to place its followers into positions of influence could end up with many people in positions of influence.

  • sc_griffith@awful.systems
    9 months ago

    I think the struggle sometimes was that the model didn’t work as well as we thought it would. So in theory, it’s this great idea that you know, you get all these young people that take the pledge while they’re still in school and then they go on to have these careers and some of them go into corporate law and end up making a ton of money. And then, if you get them to buy into this philosophy of donating consistently and regularly early on, it can have a really great impact on the future.

    But I think, unfortunately, sometimes students would make this commitment when they were in school, but then, they’re a year out or two years out, and they are not making the kind of money that they thought they would, or they didn’t actually have that much of a philosophical or emotional connection to the pledge, so then the money starts coming out of their bank account, and they’re like, what is this? I’m canceling.

    well, that can’t be correct. that seems to suggest that focusing on the charitable intentions or nonintentions of the wealthy only serves to distract from the need for mandatory redistribution. is there a typo?