• MightEnlightenYou@lemmy.world
    11 months ago

    I am actually hoping for AGI to take over the world, but in a good way. It’s just that I worry about the risk of it being misaligned with “human goals” (whatever that means). Skynet seems a bit absurd, but the paperclip maximizer scenario doesn’t seem completely unlikely.

    • mrbaby@lemmy.world
      11 months ago

      Human goals are usually pretty terrible. Become the wealthiest subset of humans. Eradicate some subset of humans. Force all other humans to align with a subset of humans. I guess curing diseases sometimes. And some subsets probably fuck.

      We need an adult.