I am a teacher and I have a LOT of literature material that I wish to study and play around with.

I wish to have a self-hosted and reasonably smart LLM into which I can feed all the textual material I have generated over the years. I would be interested to see whether this model can answer some of the subjective questions I have set on my exams over the years, or write small paragraphs about the topics I teach.

In terms of hardware, I have an old Lenovo laptop with an NVIDIA graphics card.

P.S.: I am not technically very experienced. I run Linux and can do very basic stuff. I've never self-hosted anything other than LibreTranslate and a Pi-hole!

  • applepie@kbin.social · 7 months ago

    You would need a 24GB VRAM card to even start this thing up. Prolly would yield shitty results

    • Bipta@kbin.social · 7 months ago

      They didn’t even mention a specific model. Why would you say they need 24GB to run any model? That’s just not true.

      • applepie@kbin.social · 7 months ago

        I didn’t say any model. Based on what they’re asking, they can’t just run this shit on an old laptop.
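
For what it's worth, the VRAM dispute above can be settled with back-of-the-envelope arithmetic: a model's weight footprint is roughly parameter count times bytes per parameter, plus some headroom for the KV cache and activations. The sketch below is a rough estimate, not a benchmark; the 20% overhead factor and the example model sizes are assumptions.

```python
def estimate_vram_gb(n_params_billion: float, bits_per_param: float,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate for inference: weights only, plus ~20%
    headroom for KV cache and activations (an assumed fudge factor)."""
    weight_bytes = n_params_billion * 1e9 * (bits_per_param / 8)
    return round(weight_bytes * overhead / 2**30, 1)

# A 7B model quantized to 4 bits fits in about 4 GB:
print(estimate_vram_gb(7, 4))    # ~3.9 GB
# The same model in fp16 needs roughly 16 GB:
print(estimate_vram_gb(7, 16))   # ~15.6 GB
```

So "24GB or nothing" only holds for large unquantized models; a 4-bit 7B model is within reach of many older laptop GPUs, and llama.cpp-style runtimes can also offload layers to system RAM at the cost of speed.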