• DahGangalang@infosec.pub · 9 months ago

    Yes? I think that depends on your specific definition and requirements of a Turing machine, but I think it's fair to compare the amalgamation of cells that is me to the "AI" LLM programs of today.

    While I do think that the complexity of input, output, and "memory" in current LLM AIs is limited (which makes the comparison to "human" intelligence feel like a stretch), I do think the underlying process is fundamentally comparable.

    The things that make me "intelligent" are just a robust set of memories, lessons, and habits that allow me to assimilate new information and experiences in a way that makes sense to (most of) the people around me. (This abstracts away the fact that the process is largely governed by chemical reactions, but since consciousness appears to be just a particularly complicated chemistry problem, that only reinforces the point I'm trying to make.)

    • Wheaties [she/her]@hexbear.net · 9 months ago

      My definition of a Turing machine? I'm not sure you know what Turing machines are. A Turing machine is a general-purpose computer, described in principle. And, in principle, a computer can only carry out one operation at a time. Modern computers are fast, and they may have several CPUs stitched together and operating in tandem, but they are still fundamentally limited by this. Bodies don't work like that. Every part of them is constantly reacting to its environment and its neighboring cells, concurrently.
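      For concreteness, here is a toy sketch in Python of the "one operation at a time" constraint in the Turing model: a single head reads one cell, writes one symbol, and makes one move per step. (The machine and its rule table are hypothetical examples, just for illustration.)

          # Toy Turing machine: one tape, one head, one transition per step.
          # Rule table: (state, symbol) -> (new_state, write_symbol, move)
          # This hypothetical machine flips every bit, then halts at the blank.
          RULES = {
              ("scan", "0"): ("scan", "1", +1),
              ("scan", "1"): ("scan", "0", +1),
              ("scan", "_"): ("halt", "_", 0),
          }

          def run(tape):
              tape = list(tape) + ["_"]   # "_" marks the blank end of the tape
              state, head = "scan", 0
              while state != "halt":
                  # The defining constraint: exactly one cell is read, one
                  # symbol written, and one move made on each step.
                  state, symbol, move = RULES[(state, tape[head])]
                  tape[head] = symbol
                  head += move
              return "".join(tape).rstrip("_")

          print(run("0110"))  # -> "1001"

      However fast the hardware, the model is still this serial loop; nothing in it corresponds to billions of cells all updating at once.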

      You are essentially saying, "Well, the hardware of the human body is very complex, and this software is(n't quite as) complex; so the same sort of phenomenon must be taking place." That's absurd. You're making a lopsided comparison between two very different physical systems. Why should the machine we built for doing sums just so happen to reproduce a phenomenon we still don't fully understand?