• Amaltheamannen

    Check out /r/localllama. Preferably you want an Nvidia GPU with >= 24 GB of VRAM, but it also works on a CPU with plenty of ordinary RAM, if you can wait a minute or two for a lengthy answer. There are loads of models to choose from, many with no censorship at all. They won’t be as good as GPT-4, but many are close to GPT-3.
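    For example, here’s a minimal sketch of loading a local model with the Hugging Face transformers library, using the GPU if one is available and falling back to CPU otherwise (the model id is a placeholder, not a recommendation):

    ```python
    # Minimal sketch, assuming transformers and torch are installed and
    # "your/local-model" is a placeholder for a checkpoint you have
    # downloaded (e.g. a Llama-family model).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "your/local-model"  # placeholder, not a real model id
    device = "cuda" if torch.cuda.is_available() else "cpu"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        # fp16 roughly halves VRAM use on GPU; CPU sticks to fp32
        torch_dtype=torch.float16 if device == "cuda" else torch.float32,
    ).to(device)

    prompt = "Explain why GPUs speed up language models."
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    output = model.generate(**inputs, max_new_tokens=100)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
    ```

    The same code runs on CPU, just far more slowly, which is where the “wait a minute or two” comes from.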

    • quo@feddit.uk

      I understand why a graphics card and a lot of VRAM matter for image AI like Stable Diffusion, but why do these specs also matter for language models, which don’t produce graphics?

      • Amaltheamannen

        GPUs have a lot of fast memory and are great at doing things in parallel. Most AI workloads are just operations on matrices, which is essentially what a GPU is built for.
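        You can see the gap with an illustrative sketch in PyTorch (assumed installed): time one large matrix multiply on the CPU, then on the GPU if one is available.

        ```python
        import time
        import torch

        n = 4096
        a = torch.randn(n, n)
        b = torch.randn(n, n)

        t0 = time.perf_counter()
        a @ b  # matrix multiply on the CPU
        print(f"CPU: {time.perf_counter() - t0:.3f}s")

        if torch.cuda.is_available():
            a_gpu, b_gpu = a.cuda(), b.cuda()
            # warm-up so the timed call isn't dominated by CUDA startup
            _ = a_gpu @ b_gpu
            torch.cuda.synchronize()
            t0 = time.perf_counter()
            a_gpu @ b_gpu
            torch.cuda.synchronize()  # wait for the GPU to finish
            print(f"GPU: {time.perf_counter() - t0:.3f}s")
        ```

        A language model’s forward pass is essentially a long chain of these multiplications, so the speedup compounds.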

      • averagedrunk

        GPUs are great for parallel tasks, and computing an answer from a language model requires a huge number of parallel tasks. CPUs are amazing at doing one thing at a time.