Modern AI data centers consume enormous amounts of power, and it looks like they will get even more power-hungry in the coming years as companies like Google, Microsoft, Meta, and OpenAI strive toward artificial general intelligence (AGI). Oracle has already outlined plans to use nuclear power plants for its 1-gigawatt data centers. It looks like Microsoft plans to do the same, as it just inked a deal to restart a nuclear power plant to feed its data centers, reports Bloomberg.
That’s one group’s opinion. We still see LLMs improving, and I’m sure they will continue to improve and be adapted for whatever future uses we need them for. I personally find them great in their current state for what I use them for.
What skin do you have in this game? Leading industry experts, who btw want to SELL IT TO YOU, told you it has hit a ceiling. Why do you dispute it so much? Let it die; we will all be better off.
I use them regularly for personal and work projects; they work great at outlining what I need to do in a project as well as identifying oversights in it. If industry experts are really saying this, then why are improvements still being made, and why are they still providing value to people? Just because you don’t use them doesn’t mean they aren’t useful.
Even if they didn’t improve further, there are still uses for the LLMs we have today. And that’s only one kind of AI; the kind that makes all the images and videos is completely separate, and it has come a long way too.
I made this chart for you:
------ Expectations for AI
----- LLMs’ actual usefulness
----- What I think of it
----- LLMs’ usefulness after accounting for costs