beebarfbadger@lemmy.world to Showerthoughts@lemmy.world · 9 months ago
There was a time when the entirety of the internet would have fit onto the device you're currently browsing on.
Possibly linux@lemmy.zip · 9 months ago
The Mistral language model is 3.8 GB and has a crazy amount of knowledge.
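The 3.8 GB figure is plausible for a roughly 7-billion-parameter model quantized to about 4 bits per weight, a common local-inference setup. A quick back-of-the-envelope check (the parameter count and effective bit width here are assumptions for illustration, not from the thread):

```python
# Rough estimate: why a ~7B-parameter model can fit in ~3.8 GiB.
# Assumes Mistral 7B (~7.24e9 parameters) quantized to ~4 bits/weight,
# with a little extra for per-block scaling factors. Illustrative only.
params = 7.24e9
bits_per_weight = 4.5  # 4-bit weights plus quantization-block overhead
size_gib = params * bits_per_weight / 8 / 2**30
print(f"~{size_gib:.1f} GiB")
```

At full 16-bit precision the same model would need roughly four times as much space, which is why quantization is what makes running it on an ordinary laptop practical.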
Possibly linux@lemmy.zip · 9 months ago
I wouldn't say a lot. Llama2 is way worse in my experience. Mistral gives fairly factual information. Regardless, it is still wild that 3.8 GB can go so far.
trolololol@lemmy.world · 9 months ago
Then it will pass better on the Turing test. It's a feature.
[deleted by creator]