What is currently the leading chatbot that can be self hosted on your computer?
I have heard about a lot of them, and it seems like everyone and their dog is making one, but I'm not sure which one I should use.
Edit: I am running a 3060 with 12GB of VRAM
I’d recommend koboldcpp for your backend, SillyTavern for your frontend, and I’ve been a fan of dolphin-2.1-mistral-7B. I’ve been using the Q4_K_S. But you could probably run a 13B model just fine.
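Once koboldcpp is running, SillyTavern (or anything else) just talks to its local HTTP API. As a rough sketch of doing that by hand: the port (5001) is koboldcpp's usual default, and the `/api/v1/generate` request fields are my assumptions from the Kobold API, so check your build's docs if it errors out.

```python
import json
import urllib.request

# Assumed default koboldcpp address; it's whatever --port you launched with.
KOBOLD_URL = "http://localhost:5001/api/v1/generate"

def build_generate_payload(prompt, max_length=120, temperature=0.7):
    """Request body for the Kobold generate endpoint (field names assumed)."""
    return {
        "prompt": prompt,
        "max_length": max_length,      # tokens to generate
        "temperature": temperature,    # sampling temperature
    }

def generate(prompt):
    """POST the prompt to a locally running koboldcpp instance."""
    req = urllib.request.Request(
        KOBOLD_URL,
        data=json.dumps(build_generate_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=300) as resp:
        body = json.load(resp)
    # The Kobold API puts the completion under results[0].text.
    return body["results"][0]["text"]

# Usage (only works with koboldcpp actually running):
# print(generate("Write a one-line greeting."))
```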
I’ve heard good things about the nous-hermes models (I was a big fan of their Llama2 model). I’d stick to mistral variants, personally. Their dataset/training has far surpassed base Llama2 stuff in my opinion.
LM Studio https://lmstudio.ai/
Easiest way to get started
We have also made a ChatGPT alternative optimized for self-hosting at https://www.reddit.com/r/selfhosted/comments/187jmte/selfhosted_alternative_to_chatgpt_and_more/
Hope you like it :)
I haven't tried running any myself, so my knowledge is just from having glanced at a few discussions in AI communities when it's come up, but I think Mistral 7B might be the current best, or a fine-tune of it such as Mistral 7B OpenOrca or Mistral 7B OpenHermes.
I think the most advanced open-source LLM right now is considered to be Mistral 7B OpenOrca. You can serve it via the Oobabooga GUI (which lets you try other LLM models as well). If you don't have a GPU for inference, though, it will still run but be much slower, nothing like the ChatGPT experience.
https://github.com/oobabooga/text-generation-webui
You can also try these models on your desktop using GPT4All, which doesn't support GPUs at the moment.
Thanks for the write-up. Will the Coral USB work for inference?
Mistral OpenOrca is a good one. I pull about 10 to 11 tokens/sec. Very impressive. For some reason, though, I cannot get GPT4All to use my 2080 Ti even though it is selected in the settings.
I recommend LM Studio (https://lmstudio.ai/).
It allows you to manage and download models from Hugging Face, and suggests models compatible with your machine. Additionally, it can start a local HTTP server that functions similarly to OpenAI's API.
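To give an idea of what that looks like, here's a minimal sketch of calling the local server with just the standard library. The port (1234) is the commonly cited LM Studio default and the `"local-model"` name is a placeholder, so check what the app prints when you start its server.

```python
import json
import urllib.request

# Assumed LM Studio default; verify the address shown in the app's server tab.
BASE_URL = "http://localhost:1234/v1"

def build_chat_payload(user_message, temperature=0.7):
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": "local-model",  # placeholder; LM Studio serves whatever model is loaded
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "temperature": temperature,
    }

def chat(user_message):
    """Send one chat turn to the local OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_payload(user_message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Usage (needs the LM Studio server running):
# print(chat("Name three self-hostable LLM frontends."))
```

Because the endpoint mimics OpenAI's API shape, most existing OpenAI client code should work against it by just pointing the base URL at localhost.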
Can someone explain what is the benefit of running all of these models locally? Are they better than the free available chatgpt? Any good reading on how to learn/get started with all this?
This is OK. The best thing is to go to Hugging Face and explore. Join openllama here on Reddit; they have a leaderboard too. This is one of the good ones: https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca
Are there any non-woke chatbots that will get basic facts correct, such as there being only 2 genders?
Get a life
Libtard
Oh, good one. I am overwhelmed by the intelligence and cogency of your argument.
Woketard
You're conflating gender with sex, which is a human error. You also seem to be obsessed with this topic, since you're bringing it up when it's irrelevant. Cringe.
I can see there are libtards and woketards in here lol
The earth is round and there are only 2 genders. So radical, right?
fr fr i hate when fucking AI believes the round earth propaganda and lies instead of listening to the truth and facts
You phrased your question wrong - Reddit is too woke to respond reasonably.
And also there isn’t.
Wait… Grok?
Yeah I figured. My goal is to have negative karma so I’m on the right track
Reddit was woke 12 years ago when I first used it on another account. It's only gotten way worse. It's the hard truth.
I've only used it for 3 or 4 years. It's always been that way to me. It's always been heavily used to push agendas that the majority don't agree with… but people think it's OK because it doesn't affect them…