I’ve been building MinimalChat for a while now, and based on the feedback I’ve received, it’s in a pretty decent place for general use. I figured I’d share it here for anyone who might be interested!
Quick Features Overview:
- Mobile PWA Support: Install the site like a normal app on any device.
- Any OpenAI-Formatted API Support: Works with LM Studio, OpenRouter, etc. (see the request sketch after this list).
- Local Storage: All data is stored locally in the browser with minimal setup. In Docker, just set a port and go.
- Experimental Conversational Mode (GPT Models for now)
- Basic File Upload and Storage Support: Files are stored locally in the browser.
- Vision Support with Maintained Context
- Regen/Edit Previous User Messages
- Swap Models Anytime: Maintain conversational context while switching models.
- Set/Save System Prompts: Set a system prompt for the conversation. Prompts are also saved to a list so they can be switched between easily.
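Since "any OpenAI-formatted API" just means the standard chat completions request shape, here is a minimal sketch of that call in TypeScript. The base URL, API key handling, and model name are illustrative assumptions (LM Studio's local server defaults to http://localhost:1234/v1), not MinimalChat's actual internals:

```typescript
// Minimal sketch of an OpenAI-formatted chat completions request.
// The base URL and model name below are assumptions for illustration.
const baseUrl = "http://localhost:1234/v1"; // LM Studio's default local server
const apiKey = "not-needed"; // local servers typically ignore the key

async function chat(userMessage: string): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "local-model", // whatever model the server has loaded
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: userMessage },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

chat("Hello!").then(console.log);
```

Any backend that accepts this shape (LM Studio, OpenRouter, a local proxy, etc.) should be swappable without touching the rest of the client.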
The idea is to make it essentially foolproof to deploy or set up while being generally full-featured and aesthetically pleasing. No additional databases or servers are needed; everything is contained and managed locally inside the web app itself.
It’s another chat client in a sea of clients, but I think it’s unique in its own ways. Enjoy! Feedback is always appreciated!
Self-Hosting wiki section: https://github.com/fingerthief/minimal-chat/wiki/Self-Hosting-With-Docker
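For the "set a port and go" Docker flow mentioned in the feature list, the launch would look roughly like the sketch below. The image name and internal port are placeholders, not the project's actual values; check the wiki page linked above for the real ones:

```sh
# Hypothetical image name and internal port; see the self-hosting wiki for the real values.
docker run -d --name minimal-chat -p 3000:8080 <minimal-chat-image>

# Then open http://localhost:3000 and optionally install it as a PWA.
```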
Will it work with Ollama?
I haven’t personally tried it with Ollama yet, but it should work, since Ollama can expose an OpenAI-formatted API: https://github.com/ollama/ollama/blob/main/docs/openai.md
I might give it a go here in a bit to test and confirm.
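For anyone who wants to check the compatibility layer themselves, here’s a hedged sketch against Ollama’s OpenAI-compatible endpoint (Ollama serves on port 11434 by default, per the doc linked above). The model name is an assumption; use whatever you’ve pulled locally:

```typescript
// Chat completions request against Ollama's OpenAI-compatible endpoint.
// Port 11434 is Ollama's default; "llama3" is an assumed model name.
const res = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3",
    messages: [{ role: "user", content: "Say hello in one sentence." }],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);
```

If that request works from a script, pointing MinimalChat at the same base URL should too.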