What's the bang-for-the-buck, go-to setup for AI image generation and LLM models?
TheBigBrother@lemmy.world to Selfhosted@lemmy.world · English · edited 5 months ago
kata1yst@sh.itjust.works · edited 5 months ago
KoboldCpp or LocalAI will probably be the easiest out-of-the-box option that has both image generation and LLMs. I personally use vLLM and HuggingChat, mostly because of vLLM's efficiency and speed.
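Both vLLM and LocalAI can serve an OpenAI-compatible HTTP API, so one client can talk to either backend. A minimal stdlib-only sketch; the base URL, port, and model name below are assumptions you'd swap for your own setup:

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completions payload (accepted by vLLM and LocalAI)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def chat(base_url: str, model: str, prompt: str) -> str:
    """POST the payload to an OpenAI-compatible server and return the reply text."""
    payload = build_chat_request(model, prompt)
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",  # standard OpenAI-compatible route
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Response shape follows the OpenAI schema: choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

Usage would look like `chat("http://localhost:8000", "my-model", "Hello!")` against a server started with something like `vllm serve <model>` (port 8000 is vLLM's default; verify against your install).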
DarkThoughts@fedia.io · 5 months ago
It's probably dead, but Easy Diffusion is imo the easiest for image generation. KoboldCpp can be a bit weird here and there, but it was the first thing that worked for me for local text generation with GPU support.
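For local text generation with KoboldCpp, its native Kobold API takes a flat JSON body at `/api/v1/generate` (port 5001 is KoboldCpp's default). A hedged sketch; treat the URL and the exact sampling parameters as assumptions to check against your build:

```python
import json
import urllib.request

def build_kobold_request(prompt: str, max_length: int = 200,
                         temperature: float = 0.7) -> dict:
    """Native KoboldCpp generate payload; field names follow the Kobold API."""
    return {
        "prompt": prompt,
        "max_length": max_length,
        "temperature": temperature,
    }

def generate(base_url: str, prompt: str) -> str:
    """POST to KoboldCpp's /api/v1/generate and return the generated text."""
    payload = build_kobold_request(prompt)
    req = urllib.request.Request(
        f"{base_url}/api/v1/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Kobold API responses nest the output as results[0].text
    return body["results"][0]["text"]
```

With a model loaded, `generate("http://localhost:5001", "Once upon a time")` would return the continuation.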