• steph@lemmy.clueware.org
    1 year ago

    Given this trend, GPT 5 or 6 will be trained on a majority of content from its previous versions, modeling them instead of the expected full range of human language. Researchers have already tested the outcome of a model-in-the-loop with pictures, and it was not pretty.

    • theludditeOP
      1 year ago

      Yeah, absolutely. The Luddite had a guest write in and suggest that if anxiety is the self turned inwards, the internet is going to be full of increasingly anxious LLMs in a few years. I really liked that way of putting it.