Would you like to see some plugins that integrate with local/self-hosted AI instead of sending data to ChatGPT? Or do you not care about privacy there, as long as the results are good?

You might be interested in GPT4All (https://gpt4all.io/index.html), which is easy to download as a desktop GUI app. Simply download a model (like Nous Hermes, about 7.5 GB) and run it even without a GPU, right on your CPU (albeit somewhat slowly).

It’s amazing what’s already possible with local AI instead of relying on large-scale, expensive, corporation-dependent AIs such as ChatGPT.
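For a rough sense of why a ~7.5 GB download can hold an entire 7B-parameter model, here is a back-of-envelope sketch in Python. The parameter count and bit widths are illustrative assumptions (local runners typically ship quantized weights), not specifics of GPT4All or Nous Hermes:

```python
# Back-of-envelope estimate of LLM weight memory at different precisions.
# Quantizing from 16-bit floats down to ~4 bits per weight is what makes
# 7B-parameter models practical to download and run on an ordinary CPU.

def model_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate weight storage in GiB, ignoring activations and overhead."""
    return n_params * bits_per_weight / 8 / 2**30

n = 7e9  # assumed 7 billion parameters

print(f"fp16 : {model_memory_gb(n, 16):.1f} GiB")  # ≈ 13.0 GiB
print(f"4-bit: {model_memory_gb(n, 4):.1f} GiB")   # ≈ 3.3 GiB
```

So even with tokenizer files and per-layer overhead on top, a quantized 7B model lands in the single-digit-GiB range, which fits in ordinary system RAM rather than requiring GPU VRAM.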

  • dethb0y@lemmy.world · 1 year ago

    I’ve not thought of a good use case for the technology myself, but if I were to use it, I’d prefer it be local just for convenience’s sake.

  • DrakeRichards@lemmy.world · 1 year ago

    How is this possible? I thought that local LLM models nearly all required ludicrous amounts of VRAM, but I don’t see anything about system requirements on their website. Is this run in the cloud?

    • swnt@feddit.de OP · 1 year ago

      It actually runs locally! I tried it just two days ago. It’s amazing!

      It’s all based on research by many people who wanted to make LLMs more accessible, because gating them behind large computational requirements isn’t really fair/nice.

  • Tango · 1 year ago

    I would love this! And thanks for the info on GPT4All; I hadn’t heard of that local LLM before.