potentiallynotfelix@lemmy.fish to Firefox (English) · 14 hours ago
Firefox introduces AI as experimental feature (lemmy.fish)
TheMachineStops@discuss.tchncs.de · 5 hours ago (edited)
It gives you several options for what to use; you can use Llama, which runs offline. It needs to be enabled first, though: about:config > browser.ml.chat.hideLocalhost (set it to false).
Swedneck@discuss.tchncs.de · 3 hours ago
…and thus it is unavailable to anyone who isn’t a power user, as they will never see a comment like this, and about:config would fill them with dread.
TheMachineStops@discuss.tchncs.de · 28 minutes ago (edited)
Lol, that is certainly true, and you would also need to set it up manually, which even power users might not manage. Thankfully there is an easy-to-follow guide here: https://ai-guide.future.mozilla.org/content/running-llms-locally/.
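For anyone following that guide: once a local model server is running, it helps to confirm it responds before pointing Firefox at it. Below is a minimal Python sketch assuming a llama.cpp llama-server (or any other OpenAI-compatible server) listening on http://localhost:8080; the port, endpoint, and prompt are illustrative assumptions, not details from this thread.

```python
import requests

# Assumption: a local OpenAI-compatible server (e.g. llama.cpp's
# llama-server loaded with a GGUF model) is listening on localhost:8080.
resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "messages": [{"role": "user", "content": "Reply with one word: ready?"}],
        "max_tokens": 16,
    },
    timeout=60,
)
resp.raise_for_status()

# Print the model's reply; if this works, a browser-side chatbot
# pointed at the same localhost address should be reachable too.
print(resp.json()["choices"][0]["message"]["content"])
```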