• ☆ Yσɠƚԋσʂ ☆@lemmygrad.mlOP

    That’s what OpenAI thought originally when they started working on GPT-5: they figured they’d just make the model bigger and it would do more. Turns out that making the model bigger doesn’t actually produce better results. We’re also at a point now where most of the publicly available information has already been scraped. So the focus is turning towards improving the algorithms for making sense of the data, as opposed to just stuffing more data into the model. And this is a problem for Nvidia, because the current generation of chips is already good enough for doing that.

    Of course, people will find ways to use more processing power, as is always the case. But at least in the near term, it’s no longer the bottleneck.