I’m interested in hosting something like this, and I’d like to know experiences regarding this topic.

The main reason to host this is privacy, and also to integrate my own PKM data (mainly markdown files).

Feel free to recommend videos, articles, other Lemmy communities, etc.

  • amzd@kbin.social
    1 year ago

    You should make sure you are running a model that fits in your VRAM; for me, a local model runs faster than any online LLM I’ve tried.
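
    As a rough back-of-the-envelope check (this rule of thumb is an assumption, not something from the comment), the VRAM needed to load a model's weights is roughly parameter count × bytes per parameter, plus some headroom for the KV cache and activations:

    ```python
    def estimate_vram_gb(params_billions: float, bits_per_param: int,
                         overhead_factor: float = 1.2) -> float:
        """Rough VRAM estimate (GiB) for loading model weights.

        overhead_factor is a hypothetical fudge factor for KV cache
        and activations; real usage varies by runtime and context length.
        """
        weight_bytes = params_billions * 1e9 * bits_per_param / 8
        return weight_bytes * overhead_factor / 1024**3

    # A 7B model quantized to 4 bits per weight needs roughly 4 GiB:
    print(f"{estimate_vram_gb(7, 4):.1f} GiB")
    ```

    So a 7B model at 4-bit quantization comfortably fits an 8 GB GPU, while the same model in 16-bit floats would not.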