• DarkThoughts@fedia.io
      2 months ago

I could see a use for local text gen, but that apparently demands quite a bit more than what desktop PCs can offer if you want actually good results and speed. Generally though, I’d rather have separate expansion cards for this. Making it part of other processors is just going to increase their price, even for those who have no use for it.

        • DarkThoughts@fedia.io
          2 months ago

Yes, I know - that’s my point. But you need the necessary hardware to run those models in a performant way. Waiting a minute for it to produce some vaguely relevant gibberish is not going to be of much use. You could also use generative text for other applications, such as video game NPCs; especially all those otherwise useless drones you see in a lot of open world titles could gain a lot of depth.