
Forums - PC - Nvidia Is Going To Massively Raise GPU Prices/Cut Supply For Gamers

HoloDust said:
sc94597 said:

We already have ASICs in the form of TPUs and other NPUs. Most companies (the behemoth that is Google excepted) don't invest in them, despite their efficiency gains, because they don't want to be stuck with hardware they can't use if architecture requirements shift, which is far less arbitrary than crypto-mining but can still happen very unpredictably. GPUs are less efficient but less risky investments. There is also the inertia of the software-hardware ecosystem.

Can't say I follow the AI market much (apart from what's directly tied to my job), just extrapolating from previous trends and articles like this one:

https://venturebeat.com/infrastructure/inference-is-splitting-in-two-nvidias-usd20b-groq-bet-explains-its-next-act

Anyhow, I expect that if nVidia and AMD ditch the consumer GPU market (for the most part), other players will jump in.

There aren't many companies that can make GPUs; that's why even Sony/Nintendo/Microsoft are basically stuck choosing between AMD and Nvidia most of the time. 

Creating a new GPU gaming architecture that could viably compete would be a massive undertaking for anyone else, and they would be hamstrung by the same modern reality of chip production ... things like RAM and other key components are going to be sky high because every bit of that which goes into some gamer's GPU could have been used for a more profitable AI server instead. 

Last edited by Soundwave - on 03 January 2026

Soundwave said:

things like RAM and other key components are going to be sky high because every bit of that which goes into some gamer's GPU could have been used for a more profitable AI server instead. 

yay!



Soundwave said:
HoloDust said:

Can't say I follow the AI market much (apart from what's directly tied to my job), just extrapolating from previous trends and articles like this one:

https://venturebeat.com/infrastructure/inference-is-splitting-in-two-nvidias-usd20b-groq-bet-explains-its-next-act

Anyhow, I expect that if nVidia and AMD ditch the consumer GPU market (for the most part), other players will jump in.

There aren't many companies that can make GPUs; that's why even Sony/Nintendo/Microsoft are basically stuck choosing between AMD and Nvidia most of the time. 

Creating a new GPU gaming architecture that could viably compete would be a massive undertaking for anyone else, and they would be hamstrung by the same modern reality of chip production ... things like RAM and other key components are going to be sky high because every bit of that which goes into some gamer's GPU could have been used for a more profitable AI server instead. 

This.

GPUs are hard to make well... It's an entire ecosystem you need to get working: not just the hardware, but the software as well.

S3's last attempt was with their Chrome series of graphics chips, which dropped around 2004... and were "relevant" until around 2009.
And whilst they were relatively performant, they suffered from one big caveat... compatibility and drivers.

In VIA's case though, it didn't matter whether Chrome managed to secure market share; just developing their graphics I.P. was sufficient, as it would end up in integrated graphics anyway.

Since HTC bought S3 Graphics, the company has mostly been kept around for its patents, which are extremely extensive.

Intel, despite having shipped the most "graphics chips" in the PC market for most of its life, struggled to make a decent graphics processor with decent software support.
It's only relatively recently that they have been investing heavily in GPU development, mostly as a response to nVidia, which is now worth trillions off graphics chips and A.I... But even as of today, with billions invested, their drivers are still not as seamless as AMD's or nVidia's, and they have only secured a rounding error of market share in the discrete GPU space.

BUT. That investment is paying off in their integrated graphics, which have improved drastically in recent years... I don't even call them Intel Decelerators anymore.

We also have Matrox, who have always stuck around in the market but essentially stopped GPU development after their Parhelia couldn't garner market share... Now they rebadge AMD or Intel GPUs for professional markets.

We had STMicro with its Kyro graphics chips (PowerVR designs licensed from Imagination Technologies)... Which again, despite being very budget friendly, did exhibit artifacting/missing geometry in games, as they just lacked that refined software support... STMicro eventually abandoned desktop graphics, while the PowerVR side pivoted fully to mobile, where it had great success.

GPUs are hard to make, and they are also hard to keep developing at a pace that matches AMD's and nVidia's release cadence.




www.youtube.com/@Pemalite

Soundwave said:
HoloDust said:

Can't say I follow the AI market much (apart from what's directly tied to my job), just extrapolating from previous trends and articles like this one:

https://venturebeat.com/infrastructure/inference-is-splitting-in-two-nvidias-usd20b-groq-bet-explains-its-next-act

Anyhow, I expect that if nVidia and AMD ditch the consumer GPU market (for the most part), other players will jump in.

There aren't many companies that can make GPUs; that's why even Sony/Nintendo/Microsoft are basically stuck choosing between AMD and Nvidia most of the time. 

Creating a new GPU gaming architecture that could viably compete would be a massive undertaking for anyone else, and they would be hamstrung by the same modern reality of chip production ... things like RAM and other key components are going to be sky high because every bit of that which goes into some gamer's GPU could have been used for a more profitable AI server instead. 

Among other things (like Intel and Chinese GPUs), we'll see where we are in a few years, when there is an ecosystem of not-so-expensive ARM-based handhelds running Steam/SteamOS natively, and whether those mobile SoCs and architectures proliferate into the desktop space as well, packaged in GabeCube-like products.

If there is an opening in the market, someone will inevitably jump in to fill it - it's the nature of the beast.



Keep in mind that your "$5000 according to Nvidia" is a Gemini-generated rumor from a Korean tech forum after a long game of telephone.

Yes, Nvidia has absurd 80% gross margins on Blackwell and the like, but so did Intel on their Xeon server parts back in their golden days, and a few years from now, Chinese manufacturers are going to sweep into the RAM and GPU markets so fast that you won't even believe it.




haxxiy said:

Keep in mind that your "$5000 according to Nvidia" is a Gemini-generated rumor from a Korean tech forum after a long game of telephone.

Yes, Nvidia has absurd 80% gross margins on Blackwell and the like, but so did Intel on their Xeon server parts back in their golden days, and a few years from now, Chinese manufacturers are going to sweep into the RAM and GPU markets so fast that you won't even believe it.

This. Gaps get filled. If Nvidia rolls on, someone else will jump in.



“Consoles are great… if you like paying extra for features PCs had in 2005.”

Mid-sized and emerging companies can seize the opportunity and fill the gaps left by nVidia and AMD. But it would be awkward as hell if PS7 ditches backwards compatibility with 3 generations when PS6 is fully backwards compatible with PS5 and PS4 lol. That's a massive ass library to casually delete in this day and age. A non-nVidia "Switch 3" would also drop BC with two major consoles.

Ideally, Sony should've remained a GPU/CPU designer after the PS2. But PS3's failure scared them out of the business. Now they, like Nintendo, are at the mercy of chip makers' and designers' whims.



Anyone who could create any kind of GPU architecture that can even remotely compete with Nvidia isn't going to focus on gaming, lol. They'll go straight to AI too and watch their stock price skyrocket. Investors don't give a shit about gaming, it's a small potatoes business.

It's like saying someone with the ability and talent to play in the NBA would rather play in your local men's league instead; they'd be pretty stupid to do that with that skill set. Maybe in the 1950s or something, before big-ticket contracts were a thing for pro athletes, that might have been plausible, but today it would just be stupid.

AI has forever changed the GPU game, might as well accept it. 

Last edited by Soundwave - on 11 January 2026

Kyuu said:

Mid-sized and emerging companies can seize the opportunity and fill the gaps left by nVidia and AMD. But it would be awkward as hell if PS7 ditches backwards compatibility with 3 generations when PS6 is fully backwards compatible with PS5 and PS4 lol. That's a massive ass library to casually delete in this day and age. A non-nVidia "Switch 3" would also drop BC with two major consoles.

Ideally, Sony should've remained a GPU/CPU designer after the PS2. But PS3's failure scared them out of the business. Now they, like Nintendo, are at the mercy of chip makers' and designers' whims.

x86 and graphics programming APIs probably aren't going anywhere, and even if they are, emulation exists if there's enough interest in it. It's not like backwards compatibility has always been enabled by hardware.
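
For what it's worth, here's a toy sketch of that point in C. Everything in it is hypothetical (a made-up three-instruction "legacy" machine, not any real console or ISA): software backwards compatibility is, at bottom, a fetch-decode-execute loop reproducing the old machine's behavior on whatever hardware you happen to have.

/* toy_emu.c - hypothetical sketch of backwards compatibility in software.
   The "legacy" instruction set below is invented for illustration only. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

enum { OP_LOAD, OP_ADD, OP_PRINT, OP_HALT };

int main(void) {
    /* A made-up "legacy binary": load 40 into r0, add 2, print r0, halt. */
    uint8_t program[] = { OP_LOAD, 0, 40, OP_ADD, 0, 2, OP_PRINT, 0, OP_HALT };
    uint8_t regs[4] = { 0 };
    size_t pc = 0;

    for (;;) {
        uint8_t op = program[pc++];                 /* fetch */
        switch (op) {                               /* decode + execute */
        case OP_LOAD:  { uint8_t r = program[pc++]; regs[r]  = program[pc++]; break; }
        case OP_ADD:   { uint8_t r = program[pc++]; regs[r] += program[pc++]; break; }
        case OP_PRINT: { uint8_t r = program[pc++];
                         printf("r%u = %u\n", (unsigned)r, (unsigned)regs[r]); break; }
        default:       return 0;                    /* OP_HALT */
        }
    }
}

Compile it with any C compiler and it prints r0 = 42. Real emulators and translation layers (Proton/DXVK and the like) are enormously more complex, but the principle is the same, and it runs entirely without the original hardware.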

Soundwave said:

Anyone who could create any kind of GPU architecture that can even remotely compete with Nvidia isn't going to focus on gaming, lol. They'll go straight to AI too and watch their stock price skyrocket. Investors don't give a shit about gaming, it's a small potatoes business.

It's like saying someone with the ability and talent to play in the NBA would rather play in your local men's league instead; they'd be pretty stupid to do that with that skill set. Maybe in the 1950s or something, before big-ticket contracts were a thing for pro athletes, that might have been plausible, but today it would just be stupid.

AI has forever changed the GPU game, might as well accept it. 

There could still be room for something that's poor at AI but good enough at graphics at a price that's not driven up by AI needs. I wouldn't bet on it happening, but it certainly could. It's definitely a profitable market if you can find a way to enter it, so if the AI-capable companies neglect gaming too badly, someone will fill in the void sooner or (probably) later.



My 7800X3D rig with a 4090 and 64GB of DDR5 RAM from mid-2023 is suddenly looking like a terrific idea and a real steal in hindsight. I can't believe how absurd the pricing has gotten. Just about the only part of the space where pricing is getting more reasonable is monitors - but the kicker is that the average consumer won't want to (or can't afford to) spend what's required on hardware to make full use of said monitors and their shiny features.