“There’s no way to get there without a breakthrough,” OpenAI CEO Sam Altman said, arguing that AI will soon need even more energy.
This is the best summary I could come up with:
The process is ludicrously energy intensive, with experts estimating that the industry could soon suck up as much electricity as an entire country.
Unperturbed, billionaires including Jeff Bezos, Peter Thiel and Bill Gates have poured substantial amounts of money into the idea.
However, while the emerging crop of startups like Helion has repeatedly claimed that fusion energy is right around the corner, we have yet to see any concrete results.
Of course, if Altman’s rosy vision of the future of energy production were to turn into a reality, we’d have a considerably greener way to power these AI models.
According to an October paper published in the journal Joule, adding generative AI to Google Search alone would balloon its energy use more than tenfold.
“Let’s not make a new model to improve only its accuracy and speed,” University of Florence assistant professor Roberto Verdecchia told the New York Times.
The original article contains 525 words, the summary contains 149 words. Saved 72%. I’m a bot and I’m open source!
Darn
pocket nuke plants… have to be the stopgap between here and fusion. are there still people working on those car-sized nuke plants for a more distributed system?
Is the answer people? I think I’ve seen this movie before.
The human brain uses about 20W. Maybe AI needs to be more efficient instead?
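To put that 20 W figure in perspective, here's a rough back-of-envelope comparison. The ~700 W figure for a single high-end accelerator is my own assumption (ballpark TDP of an NVIDIA H100-class GPU), not something from the article:

```python
# Back-of-envelope: human brain vs. one data-center GPU.
# The ~20 W brain figure is from the comment above; the ~700 W GPU
# figure is an assumed ballpark (high-end accelerator TDP), not a
# number from the article.

BRAIN_WATTS = 20
GPU_WATTS = 700  # assumed; real draw varies by workload and model

ratio = GPU_WATTS / BRAIN_WATTS
print(f"One GPU draws roughly {ratio:.0f}x the power of a human brain")

# Energy over a day of continuous operation, in kilowatt-hours:
HOURS = 24
print(f"Brain: {BRAIN_WATTS * HOURS / 1000:.2f} kWh/day")
print(f"GPU:   {GPU_WATTS * HOURS / 1000:.2f} kWh/day")
```

And that's one GPU; a serving cluster runs thousands of them, which is why the efficiency argument keeps coming up.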
That would require a revolutionary discovery in material science and hardware.
And yet we have brains. This brute force approach to machine learning is quite effective but has problems scaling. So, new energy sources or new thinking?
Perfect, let’s use human brains as CPUs then. Not the whole brain, just the unused bits.
It’s what The Matrix would’ve been if the studios hadn’t thought people would be too dumb to get it, so we ended up with the nonsense about batteries.
I would love it (if there exists a FOSS variant of that). Imagine being able to run an LLM, or even a LAM, in your head…
wait…
🤔