So this is basically another local LLM client? How is it different from the other ones?
I've tested several of these open-source projects. The points that stood out to me about Jan were:
- Desktop application, not web-based. No need to install or configure anything else.
- Model selection and download happen entirely within the program. It's as easy as installing a plugin.
- User-friendly interface, keeps a history of the “conversations”.
Open-source?
Yes.
That sounds very similar in features to LM Studio. Have you tried that one? How does it compare?
How does a non-programmer know whether offerings like LM Studio are truly offline and can be trusted?
For a simple test that it's running offline, unplug your internet before chatting (after downloading the model). If trust is a bigger concern, you might want to go with something popular and open source, so that third-party devs have an eye on the code. As far as I know, LM Studio is not open source, since there isn't much code on their GitHub.
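If you want to go a step beyond unplugging, you can also check whether the app has any sockets open while you chat. Here's a minimal sketch, assuming Python with the third-party psutil package installed, and assuming the client's process name contains "jan" (swap in whatever the actual process is called on your machine):

```python
# Minimal sketch: list any network connections held by a locally running
# LLM client. Assumes the process name contains TARGET; adjust as needed.
# Requires: pip install psutil
import psutil

TARGET = "jan"  # hypothetical process-name fragment; change for LM Studio etc.

for proc in psutil.process_iter(["pid", "name"]):
    name = (proc.info["name"] or "").lower()
    if TARGET not in name:
        continue
    try:
        conns = proc.connections(kind="inet")  # open TCP/UDP sockets
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        continue
    if not conns:
        print(f"{proc.info['name']} (pid {proc.info['pid']}): no open sockets")
        continue
    print(f"{proc.info['name']} (pid {proc.info['pid']}): open sockets found")
    for c in conns:
        remote = f"{c.raddr.ip}:{c.raddr.port}" if c.raddr else "-"
        print(f"  {c.status:<12} local {c.laddr.ip}:{c.laddr.port} -> remote {remote}")
```

A localhost-only listener (e.g., the app's own API server bound to 127.0.0.1) is normal; connections to remote addresses during chat would be the thing to look into.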
This looks interesting. Has anyone tried this out?
Going to run it when I get the time. In a container, of course.
I installed it on Linux and it's headed for a live environment.
Starling looks good so far.
One improvement I'd recommend is making links distinguishable. They're currently the same color as regular text in the chat, black by default. I'd recommend blue.
Can it be better than ChatGPT?