wuphysics87 to Privacy · 16 hours ago
Can you trust locally run LLMs?
I’ve been playing around with ollama. Given that you download the model yourself, can you trust that it isn’t sending telemetry?
foremanguy · 6 hours ago
The only real way of checking is to inspect the packets it sends and/or audit the source code. This “problem” isn’t unique to local AI; it applies to open source software in general.
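To make the "check the packets it sends" suggestion concrete: here's a rough, Linux-only sketch that lists established outbound TCP connections by parsing `/proc/net/tcp`. Run it while a model is loaded and generating to spot unexpected remote endpoints. This is an illustration, not a full audit; for per-process attribution you'd match socket inodes against `/proc/<pid>/fd`, or just reach for tools like `lsof -i`, `ss`, or `tcpdump`.

```python
# Sketch: list remote endpoints of ESTABLISHED TCP sockets on Linux by
# reading /proc/net/tcp. Addresses are stored as little-endian hex.

def decode_addr(hex_addr: str) -> str:
    """Decode a /proc/net/tcp address like '0100007F:1F90' -> '127.0.0.1:8080'."""
    ip_hex, port_hex = hex_addr.split(":")
    # IPv4 bytes are little-endian, so read the hex pairs in reverse order.
    ip = ".".join(str(int(ip_hex[i:i + 2], 16)) for i in range(6, -2, -2))
    return f"{ip}:{int(port_hex, 16)}"

def established_remotes(path: str = "/proc/net/tcp") -> list[str]:
    """Return remote endpoints of all sockets in state 01 (ESTABLISHED)."""
    remotes = []
    with open(path) as f:
        next(f)  # skip the header line
        for line in f:
            fields = line.split()
            if fields[3] == "01":  # column 4 is the connection state
                remotes.append(decode_addr(fields[2]))  # column 3 is rem_address
    return remotes

if __name__ == "__main__":
    for remote in established_remotes():
        print(remote)
```

A model that is truly local should show no new remote connections appearing during inference; traffic to `127.0.0.1:11434` (ollama's default port) is just the local API, not telemetry.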