☆ Yσɠƚԋσʂ ☆ (OP)
    1 year ago

    Yeah, this has to be pretty limited compared to actual ChatGPT. The RAM needed for models at GPT-3/4 scale is well beyond consumer hardware: a 13B-parameter model takes about 8 GB quantized, and GPT-3 is around 175B parameters, as I recall.
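    For a rough sense of scale, here's a back-of-the-envelope sketch (the per-precision byte counts are the standard ones; the ~8 GB figure above lines up with roughly 4-bit quantization, which is an assumption on my part):

    ```python
    # Memory for the weights alone, ignoring activations and KV cache.
    def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
        # 1e9 params * bytes-per-param / 1e9 bytes-per-GB cancels out
        return params_billion * bytes_per_param

    for name, params in [("13B model", 13.0), ("GPT-3 (175B)", 175.0)]:
        for precision, nbytes in [("fp16", 2.0), ("int8", 1.0), ("4-bit", 0.5)]:
            print(f"{name} @ {precision}: ~{weight_memory_gb(params, nbytes):.0f} GB")
    ```

    At fp16 that's ~26 GB just for a 13B model's weights, and ~350 GB for 175B, which is why quantization helps locally but the biggest models still don't fit.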

    I think the way to get something comparable would be an approach like the one used with BLOOM, where a torrent-style distributed system effectively creates a crowd-sourced supercomputer.
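    If I remember right, the Petals project does something like this for BLOOM: each peer hosts a slice of the model's layers, and activations hop from peer to peer, BitTorrent-style. A toy sketch of the idea (every name here is hypothetical, and a real system also needs peer discovery, redundancy, and fault tolerance):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Peer:
        name: str
        layers: range  # contiguous block of transformer layers this peer hosts

        def forward(self, activations: list[float]) -> list[float]:
            # Stand-in for actually running this peer's layer slice.
            print(f"{self.name} runs layers {self.layers.start}-{self.layers.stop - 1}")
            return activations

    def pipeline_inference(peers: list[Peer], activations: list[float]) -> list[float]:
        # Activations travel peer to peer until every layer slice has run.
        for peer in peers:
            activations = peer.forward(activations)
        return activations

    swarm = [Peer("peer-a", range(0, 24)),
             Peer("peer-b", range(24, 48)),
             Peer("peer-c", range(48, 70))]
    pipeline_inference(swarm, [0.1, 0.2, 0.3])
    ```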