• M500 · 8 months ago

    ChatGPT 3.5 and Google Gemini mostly. Sometimes Microsoft Copilot.

    I mostly choose the one that’s most convenient in that moment. I don’t ever measure performance or capability.

    My local home server doesn’t have a GPU yet. I might get one to run some models locally, but I’m not sure whether I care enough to spend the money on it.

    • KidnappedByKitties@lemm.ee · 8 months ago

      Wow, either you are much more skilled at these tasks than anyone I know, or you work in a very different way. I tried for two weeks to figure out how to get something useful out of it, but got only garbage.

      It was very good at generic text, but much less so at the concise, insightful, technical, or argumentative text that makes up most of what I sell.

    • A_Very_Big_Fan@lemmy.world · 8 months ago

      There are some models you can run on hardware as modest as a Raspberry Pi. I’m willing to bet there’s a pre-trained model out there that suits your needs with whatever hardware you have. Could be worth a try.
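
      If you want to experiment, here is a rough sketch of running a small quantized model locally with the llama-cpp-python bindings (my own assumption about tooling, not something anyone here said they use; the model filename is just a placeholder for whatever GGUF file you download):

        # pip install llama-cpp-python
        # "tinyllama-q4.gguf" is a placeholder; substitute any small quantized GGUF model you download
        from llama_cpp import Llama

        llm = Llama(model_path="tinyllama-q4.gguf", n_ctx=2048)  # small context window keeps memory use low
        out = llm("Q: What can I run on a Raspberry Pi? A:", max_tokens=64)
        print(out["choices"][0]["text"])

      On Pi-class hardware you’d realistically be limited to small, heavily quantized models, but for simple tasks that may be enough.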