Mummelmann said:
This is very likely the outcome; when the intended use of the massive data-centers for AI (inevitably) turns foul, the larger players will sell compute subscriptions. In essence, hardware as a concept will get centralized and we'll see even less consumer control and influence on functions, ads, protective layers and customization. It may well turn out that Stadia was more or less the right idea (from a purely corporate perspective), but too soon.
Yup, and ask yourself how much compute you really need for LLMs ... they've already absorbed all the data on the internet, and some of those models can run offline.
So if you're Nvidia, what do you need? You need all these AI hyperscalers/buyers to keep buying newer and newer AI servers ... LLMs aren't going to cut it ... you need video generation and real-time environment production (read: video games), which require much more compute, to keep pushing that GPU spend at data centers.
Introducing what are essentially AI filters into gaming is the first big step toward that.