Stamets@lemmy.world to White People Twitter@sh.itjust.works · 6 months ago — "The dream" (image post, 170 comments)
CeeBee@lemmy.world · 6 months ago

> I don’t know of an LLM that works decently on personal hardware

Ollama with ollama-webui. Models like solar-10.7b and mistral-7b work nicely on local hardware. Solar-10.7b should run well on a card with 8 GB of VRAM.
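For anyone wanting to try this, the basic Ollama workflow is roughly the following. This is a sketch, not an official guide: the install script URL and the exact model tags (`mistral:7b`, `solar:10.7b`) are assumptions about the Ollama registry's naming; check ollama.com for the current names.

```shell
# Install Ollama (Linux; other platforms have installers on ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model to local disk, then chat with it.
# Tags below assume the registry's naming scheme.
ollama pull mistral:7b
ollama run mistral:7b "Summarize what VRAM is in one sentence."

# Ollama also exposes a local HTTP API (default port 11434),
# which is what web front-ends like ollama-webui talk to.
```

A 7B model quantized to 4 bits typically fits in roughly 4–5 GB of VRAM, which is why an 8 GB card is a comfortable fit for the 7B–10.7B range mentioned above.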