• Buttons@programming.dev
    1 year ago

    You apply a reductionist view to LLMs that you do not apply to humans.

    LLMs receive words and produce the next word. Humans receive stimulus from their senses and produce muscle movements.

    LLMs are in their infancy, but I’m not convinced their “core loop”, so to speak, is any more basic than our own.

    In the world of text: text in -> word out

    In the physical world: sense stimulation in -> muscle movement out

    There’s nothing more to it than that, right?

    Well, actually there is more to it than that, we have to look at these things on a higher level. If we believe that humans are more than sense stimulation and muscle movements, then we should also be willing to believe that LLMs are more than just a loop producing one word at a time. We need to assess both at the same level of abstraction.
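    The "core loop" described above can be sketched in a few lines. This is a toy illustration only: the `next_token` lookup table is a hypothetical stand-in for a real model's forward pass, not how any actual LLM computes its next word. The point is just the shape of the loop: context in, one token out, append, repeat.

    ```python
    def next_token(context):
        # Toy stand-in for a model's forward pass (a real LLM would run a
        # neural network here); maps a context to the next token.
        table = {
            ("the",): "cat",
            ("the", "cat"): "sat",
            ("the", "cat", "sat"): "<eos>",
        }
        return table.get(tuple(context), "<eos>")

    def generate(prompt, max_tokens=10):
        # The autoregressive loop: feed the growing context back in,
        # emit one token at a time, stop at end-of-sequence.
        tokens = list(prompt)
        for _ in range(max_tokens):
            tok = next_token(tokens)
            if tok == "<eos>":
                break
            tokens.append(tok)
        return tokens

    # generate(["the"]) -> ["the", "cat", "sat"]
    ```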

    • Veraticus@lib.lgbtOP
      1 year ago

      They have no core loop. You are anthropomorphizing them. They are literally no more self-directed than a calculator, and have no more of a “core loop” than a calculator does.

      Do you believe humans are simply very advanced and very complicated calculators? I think most people would say “no.” While humans can do mathematics, we are entirely different from calculators. We experience sentience: thoughts, feelings, emotions, rationality. None of the devices we’ve ever built, no matter how clever, has any of those things, and neither do LLMs.

      If you do think humans are as deterministic as a calculator then I guess I don’t know what to tell you other than I disagree. Other people actually exist and have internal realities. LLMs don’t. That’s the difference.