I’m sitting here really hoping that models hit a plateau in capabilities soon. Continuing to get smaller/more efficient would be great, but if the capabilities of our best models would plateau for a bit and give society time to adjust to the impact I would be very happy.
We’re already seeing a slight leveling off compared to what we had previously. Right now there is a strong focus on optimization, getting models that can run on-device without losing too much quality. This will both help make LLMs sustainable financially and energy-wise, as well as mitigate the privacy and security concerns inherent to the first wave of cloud-based LLMs.
Hasn’t it already kind of plateaued, or at least slowed? They were promising the world and it seems to have kind of halted.
It’s certainly not moving as fast as their promises (whatever does), and perhaps has slowed, but for me at least it’s too early to call a plateau. Perhaps someone who works in the field or follows it more closely can provide a better characterization, though.