
Even if we accept, for the sake of argument, that the data center issue is a "silver bullet" against LLMs in the here and now - leaving aside the fact that cloud computing has been widespread since the early 2010s, and that even if Google and the like mothballed all their AI models tomorrow they'd likely just convert those data centers into ones handling e-mail and cloud storage - that's probably not going to be the case forever.

Roughly 15 years ago, IBM built Watson, an AI system that competed in a special episode of Jeopardy! and whupped two of the show's greatest champions (and answered "Toronto" for a U.S. Cities clue; yeah, even back then hallucinations were a thing). At the time, running that system required a giant server farm like those used to run modern-day LLMs. Nowadays, it would almost certainly run on a PC with 2 or 3 high-end GPUs and 128GB+ of memory - still prohibitively expensive for most people, but nowhere near as ludicrous as the hardware required back then.

So yeah, as I've said before, I think we're in for an "AI bust" in the next year or so... but if it's anything like the dot-com bust of the early 2000s, it'll just delay things until the technology catches up to the point where any smartphone can do what you need huge data centers to do today (and again, I think some people underestimate what's possible even right now with on-device processing).