Writing it off as a fad is rather ignorant. Sure, there’s a lot of hype and bullshit surrounding AI, but it already has strong use cases and earns a profit for the companies involved. It’s still the early years for the tech, too, so it’s reasonable to expect it to improve in the coming years, both in accuracy and performance.
deleted by creator
I don’t disagree with the idea that AI is being shoved into software without much purpose or thought, but that has little to do with whether it’s here to stay. It’s here to stay because of its many practical uses: personal assistants like what Apple has shown off with Siri, text summarization being added to browsers, rephrasing and tone checking being added to office software, or code completion and debugging being added to code editors. These applications have proved their worth, and even if some others exist purely because of hype, these are here to stay.
deleted by creator
It’s still a fad. Sure, it has some uses, but most of it is hype right now, and they throw AI onto everything.
I’m sitting here really hoping that models hit a plateau in capabilities soon. Continuing to get smaller and more efficient would be great, but if the capabilities of our best models were to plateau for a bit, giving society time to adjust to the impact, I would be very happy.
We’re already seeing a slight leveling off compared to what we had previously. Right now there is a strong focus on optimization: getting models that can run on-device without losing too much quality. This will both help make LLMs sustainable, financially and energy-wise, and mitigate the privacy and security concerns inherent to the first wave of cloud-based LLMs.
deleted by creator
Hasn’t it already kind of plateaued, or slowed at least? They were promising the world and it seems to have kind of halted.
It’s certainly not moving as fast as their promises (whatever does), and perhaps it has slowed, but for me at least it’s too early to call a plateau. Perhaps someone who works in the field or follows it more closely can provide a better characterization, though.
I am still managing, but I feel I’ll have to pay a premium to avoid that shit soon.