• adarza@lemmy.ca · 5 days ago
    • no account or login required.
    • it’s an addon (and one you have to go get), not baked-in.
    • limited to queries about content you’re currently looking at
      (it’s not a general ‘search’ or query engine).
    • llm is hosted by mozilla, not a third party.
    • session histories are not retained or shared, not even with mistral (it’s their model).
    • user interactions are not used to train.
    • Jeena@piefed.jeena.net · 5 days ago

      Thanks for the summary. So it still sends the data to a server, even if it’s Mozilla’s. Then I still can’t use it for work, because the data is private and they wouldn’t appreciate me sending it to Mozilla.

      • KarnaOP · 5 days ago

        In such a scenario, you’d need to host your choice of LLM locally.
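        A minimal sketch of what “hosting locally” could look like, assuming something like Ollama or a llama.cpp server is running on localhost with a Mistral model already pulled — the endpoint, port, model tag and helper name below are illustrative assumptions that depend on your setup:

        ```python
        # Query a locally hosted model over an OpenAI-compatible chat endpoint.
        # Nothing leaves your machine: the server on localhost does the inference.
        import requests

        LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"  # Ollama's default port; adjust for your server

        def ask_local_llm(page_text: str, question: str) -> str:
            payload = {
                "model": "mistral",  # whatever model tag your local server exposes
                "messages": [
                    {"role": "system", "content": "Answer questions about the provided page."},
                    {"role": "user", "content": f"{page_text}\n\nQuestion: {question}"},
                ],
            }
            resp = requests.post(LOCAL_ENDPOINT, json=payload, timeout=120)
            resp.raise_for_status()
            return resp.json()["choices"][0]["message"]["content"]

        if __name__ == "__main__":
            print(ask_local_llm("Example page content.", "Summarise this page."))
        ```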

      • LWD@lemm.ee · 4 days ago

        Technically it’s a server operated by Google and leased by Mozilla. Mistral 7B could run locally, if Mozilla cared about doing such a thing.

        I guess you can basically use the built-in AI chatbot functionality Mozilla rushed out the door, enable a secret setting, and use Mistral locally, but what a missed opportunity from the Privacy Browser Company.
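        For the record, the “secret setting” route is roughly: flip a couple of about:config prefs so the chatbot sidebar will accept a localhost provider. The pref names below come from public reporting on the feature and may change; treat them, the profile path and the port as assumptions to verify in about:config yourself. A small sketch that appends them to a profile’s user.js:

        ```python
        # Hypothetical sketch: point Firefox's built-in AI chatbot at a local model
        # by writing prefs to a profile's user.js. Pref names, profile path and the
        # localhost URL are assumptions -- verify them in about:config first.
        from pathlib import Path

        PROFILE_DIR = Path.home() / ".mozilla" / "firefox" / "xxxxxxxx.default-release"  # adjust to your profile

        PREFS = {
            "browser.ml.chat.enabled": "true",
            "browser.ml.chat.hideLocalhost": "false",               # allow localhost providers in the picker
            "browser.ml.chat.provider": '"http://localhost:8080"',  # e.g. a local llamafile / llama.cpp server
        }

        with (PROFILE_DIR / "user.js").open("a", encoding="utf-8") as f:
            for name, value in PREFS.items():
                f.write(f'user_pref("{name}", {value});\n')
        ```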

      • Hamartiogonic@sopuli.xyz · 4 days ago (edited)

        According to Microsoft, you can safely send your work-related stuff to Copilot. Besides, most companies already use a lot of Microsoft software and cloud services, so LLM queries don’t really add very much. If you happen to be working for one of those companies, MS probably already knows what you do for a living, hosts your meeting notes, knows your calendar, etc.

        If you’re working for Purism, RedHat or some other company like that, you might want to host your own LLM instead.

    • fruitycoder@sh.itjust.works · 4 days ago

      That’s really cool to see. A trusted, hosted open-source model has really been missing from the ecosystem, in my view. I really like the idea of web-centric integration too.