Can you trust locally run LLMs?
wuphysics87 to Privacy · 15 hours ago
I’ve been playing around with ollama. Given that you download the model, can you trust that it isn’t sending telemetry?
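Rather than taking anyone's word for it, you can watch the network yourself while the model runs. A rough sketch, assuming a Linux box with iproute2's `ss` available and an `ollama` process already running (both assumptions; any sandbox or network-namespace tool works just as well for the second step):

```shell
# List every TCP connection the ollama process currently holds.
# If it is truly offline after the model download, this should be empty
# (or only show the local API socket on 127.0.0.1:11434).
ss -tnp 2>/dev/null | grep ollama || echo "no open TCP connections for ollama"

# Stronger check: cut off network access entirely and confirm inference
# still works. firejail is one option among many (an assumption, not
# something ollama ships with):
#   firejail --net=none ollama run llama3 "hello"
```

If generation still works with networking disabled, the model itself can't be phoning home; only the downloader needs the network.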
Emberleaf · 5 hours ago
I found this and actually like it better than Alpaca. Your mileage may vary, but they claim that it’s 100% private, no internet necessary: GPT4All
stink@lemmygrad.ml · 2 hours ago
It’s nice, but sadly it’s hard to load unsupported models. I really wish you could sideload easily; still, it works well unless you have a niche use case.