Thanks for the summary. So it still sends the data to a server, even if it’s Mozilla’s. Then I still can’t use it for work, because the data is private and they wouldn’t appreciate me sending their data to Mozilla.
Technically it’s a server operated by Google and leased by Mozilla. Mistral 7B could technically run locally, if Mozilla cared about doing such a thing.
I guess you can basically use the built-in AI chatbot functionality Mozilla rushed out the door, enable a secret setting, and use Mistral locally, but what a missed opportunity from the Privacy Browser Company.
According to Microsoft, you can safely send your work-related stuff to Copilot. Besides, most companies already use a lot of Microsoft software and cloud services, so LLM queries don’t really add very much. If you happen to be working for one of those companies, MS probably already knows what you do for a living, hosts your meeting notes, knows your calendar, etc.
If you’re working for Purism, Red Hat, or some other company like that, you might want to host your own LLM instead.
That’s really cool to see. To me, a trusted hosted open-source model is something that’s really missing from the ecosystem. I really like the idea of web-centric integration too.
(it’s not a general ‘search’ or queries engine)
In such a scenario you need to host your choice of LLM locally.
Does the addon support usage like that?
No, but the “AI” option available on the Mozilla Labs tab in settings allows you to integrate with a self-hosted LLM.
I’ve had this setup running for a while now.
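For reference, the wiring on the Firefox side comes down to a few about:config preferences (names as of recent Firefox versions; treat the exact pref names and the localhost URL as assumptions that may differ in your build):

```
browser.ml.chat.enabled = true
browser.ml.chat.hideLocalhost = false
browser.ml.chat.provider = http://localhost:3000
```

With `hideLocalhost` flipped to false, the chatbot sidebar lets you point the provider at a locally hosted frontend such as Open WebUI instead of a cloud service.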
Which model are you running? How much RAM?
My (docker based) configuration:
Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1
Hardware: i5-13600K, Nvidia 3070 Ti (8 GB VRAM), 32 GB RAM
Docker: https://docs.docker.com/engine/install/
Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html
Open WebUI: https://docs.openwebui.com/
Ollama: https://hub.docker.com/r/ollama/ollama
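Once Docker and the Nvidia container toolkit from the links above are installed, the stack can be brought up with two containers. A minimal sketch (container names, volume names, and the `llama3.1` model tag are illustrative; the commands follow the Ollama Docker Hub page and the Open WebUI docs):

```shell
# Ollama with GPU access (assumes the Nvidia container toolkit
# is already configured as Docker's runtime)
docker run -d --gpus all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama ollama/ollama

# Pull the Llama 3.1 model inside the running container
docker exec ollama ollama pull llama3.1

# Open WebUI as the frontend, reaching the Ollama API on the host
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

After both containers are up, Open WebUI should be reachable at http://localhost:3000 and the raw Ollama API at http://localhost:11434.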