Meet Orbit, Mozilla's AI Assistant Extension for Firefox (www.omgubuntu.co.uk)
Posted by Karna to Firefox · 5 days ago · 53 comments
Karna (OP) · 5 days ago:
In such a scenario, you need to host your LLM of choice locally.
ReversalHatchery@beehaw.org · 5 days ago:
Does the addon support usage like that?
Karna (OP) · 5 days ago:
No, but the "AI" option on the Mozilla Labs tab in settings allows you to integrate with a self-hosted LLM. I have had this setup running for a while now.
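For context, one way this is commonly wired up is via about:config. The pref names below come from Firefox's Labs AI chatbot feature; the thread doesn't confirm these are the exact settings used, so treat this as a sketch:

```
# about:config — point the Labs AI chatbot at a local endpoint
# Reveal the "localhost" provider option in the chatbot settings
browser.ml.chat.hideLocalhost = false
# URL of the self-hosted LLM frontend (e.g. a local Open WebUI instance)
browser.ml.chat.provider = http://localhost:3000
```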
cmgvd3lw@discuss.tchncs.de · 4 days ago:
Which model are you running? How much RAM?
Karna (OP) · 4 days ago (edited):
My (Docker-based) configuration:

Software stack: Linux > Docker container > Nvidia runtime > Open WebUI > Ollama > Llama 3.1
Hardware: i5-13600K, Nvidia RTX 3070 Ti (8 GB VRAM), 32 GB RAM

Docker: https://docs.docker.com/engine/install/
Nvidia Runtime for Docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html
Open WebUI: https://docs.openwebui.com/
Ollama: https://hub.docker.com/r/ollama/ollama
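For anyone wanting to reproduce this, a minimal sketch using the standard commands from those docs (container names, volume names, and the host port 3000 are the documented defaults; the thread doesn't confirm the exact invocation):

```
# Ollama container with GPU access (requires the Nvidia container toolkit)
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama ollama/ollama

# Pull the Llama 3.1 model inside the running container
docker exec -it ollama ollama pull llama3.1

# Open WebUI, reaching Ollama on the host via host.docker.internal
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

With this up, Open WebUI is served on http://localhost:3000 and talks to Ollama on port 11434.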