• itslilith@lemmy.blahaj.zone
    19 points · 7 months ago

    It’s static, yes, but the static price is orders of magnitude higher. It still involves loading the whole model into VRAM and performing matrix multiplication on trillions of numbers
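A rough way to see the scale: a common rule of thumb is that a dense transformer performs about 2 FLOPs per parameter per generated token. The model size and answer length below are made-up illustration numbers, not figures from the thread:

```python
# Back-of-envelope sketch (all figures are assumptions, not measurements):
# a dense transformer does roughly 2 * n_params floating-point operations
# per generated token during a forward pass.

def inference_flops(n_params: float, n_tokens: int) -> float:
    """Approximate forward-pass FLOPs to generate n_tokens."""
    return 2 * n_params * n_tokens

# Hypothetical 1-trillion-parameter model generating a 200-token answer:
flops = inference_flops(1e12, 200)
print(f"{flops:.1e} FLOPs")  # 4.0e+14 FLOPs
```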

    • etrotta@beehaw.org
      5 points · 7 months ago

      To be fair, I wouldn’t count “loading the whole model into VRAM” toward the per-request cost, since they can just keep it resident between requests, and the parameter count may be down to hundreds of billions or tens of billions rather than trillions. But even after all those improvements it should still be orders of magnitude more expensive than a normal search, which just makes their decision even crazier.
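To put a hedged number on “orders of magnitude”: both figures below are assumptions chosen for illustration (a generous upper bound for an inverted-index lookup, and a hypothetical 100B-parameter model writing a 200-token answer at ~2 FLOPs per parameter per token):

```python
import math

# Illustrative comparison under assumed figures (not measurements):
search_ops = 1e9             # assumed upper bound: ops for one index lookup
llm_flops = 2 * 1e11 * 200   # assumed 100B-param model, 200-token answer

ratio = llm_flops / search_ops
print(f"~10^{round(math.log10(ratio))}x more compute per query")
```

Even with these deliberately LLM-friendly assumptions, the gap stays in the tens of thousands.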

    • pup_atlas@pawb.social
      2 points · 7 months ago

      Indexing and lookups on datasets as big as the ones companies like Google and Amazon run also take trillions of operations to complete, especially once you account for the constant reindexing that has to be done. In some cases, encoding data into a neural network is actually cheaper than storing the data itself. You can see this in practice with Gaussian splatting point-cloud capture, where networks are trained to guide the points in the cloud at runtime, rather than storing the positions of trillions of points over time.
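The storage trade-off in that last point can be sketched with made-up sizes. Assume a dynamic capture with a million points over a thousand frames, stored either explicitly (x, y, z per point per frame) or as initial positions plus a small hypothetical deformation MLP that predicts position from (x, y, z, t):

```python
# Sketch of the storage trade-off (all sizes are assumptions):
n_points = 1_000_000
n_frames = 1_000
bytes_per_float = 4

# Option A: store every position explicitly, per frame.
explicit = n_points * n_frames * 3 * bytes_per_float

# Option B: store initial positions once, plus a small deformation MLP.
# Hypothetical network: 4 inputs (x, y, z, t) -> 3 outputs, hidden width 256.
widths = [4, 256, 256, 256, 3]
mlp_params = sum(a * b + b for a, b in zip(widths, widths[1:]))  # weights + biases
network = n_points * 3 * bytes_per_float + mlp_params * bytes_per_float

print(f"explicit: {explicit/1e9:.1f} GB, network: {network/1e6:.1f} MB")
# explicit: 12.0 GB, network: 12.5 MB
```

Under these assumed sizes the network representation is roughly a thousand times smaller, which is the intuition behind “encoding data into a network can be cheaper than storing it.”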