I think AI is neat.

  • Daft_ish@lemmy.world
    11 months ago

    I guess I'm just looking at it from an end-user vantage point. I'm not saying the model can't understand the words it's using. I just don't think it currently understands that specific words refer to real-life objects, and that there are laws of physics that apply to those objects and govern how they interact with each other.

    Like, saying a guy exists and is a historical figure means that information is independently verified by physical objects that exist in the world.

    • Adalast@lemmy.world
      11 months ago

      In some ways, you are correct. It is coming, though. The psychological/neurological word you are searching for is "conceptualization". The AI models lack the ability to abstract the text they know into abstract ideas of the objects, at least in the way humans do. Technically, the ability to say "show me a chair" and get back images of chairs, then follow up with "show me things related to the last thing you showed me" and get couches, butts, tables, etc., is a conceptual abstraction of a sort. The issue comes when you ask "why are those things related to the first thing?" It will be a little while before a model can describe the abstraction it just performed, but it is capable of that first stage at least.
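
      The "show me things related to a chair" behavior above is roughly what embedding spaces do: concepts become vectors, and "related" just means "nearby". Here's a minimal toy sketch of that idea — the vectors and the `related_to` helper are made up for illustration, not taken from any real model:

      ```python
      # Toy illustration of concept similarity in an embedding space.
      # The 3-d vectors below are invented for the example; real models
      # learn vectors with hundreds or thousands of dimensions.
      import math

      embeddings = {
          "chair":  [0.90, 0.80, 0.10],
          "couch":  [0.85, 0.75, 0.20],
          "table":  [0.70, 0.90, 0.15],
          "banana": [0.05, 0.10, 0.95],
      }

      def cosine(a, b):
          # Cosine similarity: 1.0 means same direction, ~0 means unrelated.
          dot = sum(x * y for x, y in zip(a, b))
          na = math.sqrt(sum(x * x for x in a))
          nb = math.sqrt(sum(x * x for x in b))
          return dot / (na * nb)

      def related_to(word, k=2):
          # Return the k concepts whose vectors lie closest to `word`.
          others = [(w, cosine(embeddings[word], v))
                    for w, v in embeddings.items() if w != word]
          others.sort(key=lambda pair: pair[1], reverse=True)
          return [w for w, _ in others[:k]]

      print(related_to("chair"))  # ['couch', 'table'] — 'banana' is far away
      ```

      Note that nothing in this sketch can answer "why are those things related?" — the relatedness is pure geometry, which is exactly the gap between clustering concepts and explaining the abstraction.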