Black Mirror creator unafraid of AI because it’s “boring”
Charlie Brooker doesn’t think AI is taking his job any time soon because it only produces trash

  • GnuLinuxDude · 1 year ago

    Sam Altman (creator of the freakish iris-scanning Worldcoin) would agree, it seems. The current path for LLMs and GPT seems to be in something of a bind: to seriously improve on what they currently do, they need to do something different, not more of the same. And figuring out that something different could be very hard. https://www.wired.com/story/openai-ceo-sam-altman-the-age-of-giant-ai-models-is-already-over/

    At least that’s what I understand of it.

    • TheWiseAlaundo@lemmy.whynotdrs.org · 1 year ago

      He’s not saying “AI is done, there’s nothing else to do, we’ve hit the limit”; he’s saying “bigger models don’t necessarily yield better results like we had initially anticipated.”

      Sam recently went before Congress and advocated for limiting model sizes as a means of regulation because, at the time, he believed bigger would almost always mean better outputs. What we’re seeing now is that if a model is too large it will have trouble producing truthful output, which is super important to us humans.

      And honestly, I don’t think anyone should be shocked by this. Our own human brains have different sections that control different aspects of our lives. Why would an AI brain be different?

      • gregoryw3 · 1 year ago

        The future of AI is definitely going towards a manager/agent model. It lets an AI handle all kinds of tasks without being limited to one model or method (see the rough sketch below). We’re already seeing this with ChatGPT using Mathematica for math questions. Soon we could see art AIs using different models and methods based on the text input.
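
        A minimal sketch of that dispatch pattern, assuming nothing about any real product’s internals; every name here (Agent, manager, the routing predicates) is made up for illustration:

        ```python
        # Manager/agent sketch: a "manager" inspects each request and hands it
        # to the first specialist "agent" that claims it. Purely illustrative.
        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Agent:
            name: str
            can_handle: Callable[[str], bool]  # crude routing predicate
            run: Callable[[str], str]          # the specialist itself

        def math_agent(query: str) -> str:
            # Stand-in for handing the query to a CAS such as Mathematica.
            return f"[math tool] evaluating: {query}"

        def chat_agent(query: str) -> str:
            # Stand-in for a general-purpose language model.
            return f"[chat model] answering: {query}"

        AGENTS = [
            Agent("math", lambda q: "solve" in q or any(c.isdigit() for c in q), math_agent),
            Agent("chat", lambda q: True, chat_agent),  # catch-all fallback
        ]

        def manager(query: str) -> str:
            """Route the query to the first agent that claims it."""
            for agent in AGENTS:
                if agent.can_handle(query):
                    return agent.run(query)
            raise RuntimeError("no agent available")

        print(manager("solve 3*x + 4 = 19"))        # routed to the math agent
        print(manager("write a haiku about rain"))  # falls through to chat
        ```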

      • Browning@lemmings.world · 1 year ago

        I gather that this is partly because dataset sizes haven’t been going up with model sizes. That is likely to change soon as synthetic data starts to overtake organic data in both quantity and quality.
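
        For a rough sense of that gap: the Chinchilla result (Hoffmann et al., 2022) suggests compute-optimal training wants on the order of 20 tokens per parameter. A quick back-of-envelope (the 20x ratio is a rule of thumb, not an exact law):

        ```python
        # Back-of-envelope: how much training data the rough Chinchilla
        # ratio (~20 tokens per parameter) implies at various model sizes.
        TOKENS_PER_PARAM = 20  # approximation from Hoffmann et al. (2022)

        for params_b in (7, 70, 175, 500):  # model size, billions of parameters
            tokens_b = params_b * TOKENS_PER_PARAM  # optimal tokens, billions
            print(f"{params_b:>4}B params -> ~{tokens_b / 1000:.1f}T tokens")
        ```

        GPT-3’s 175B parameters were trained on roughly 300B tokens, an order of magnitude below the ~3.5T this rule of thumb suggests, which is the mismatch the comment is pointing at.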