haxxiy said:
hinch said:

Yeah, it's been a pretty awful few years for GPUs. And before Ampere we had those awful Turing prices. Ampere harked back to the Pascal days in actually delivering good uplifts for a reasonable amount of money across the stack. And now Nvidia seems to be regressing again, this time worse. Especially now that people are used to spending north of a grand or more on graphics cards.

What doesn't help is that AMD just continues to follow Nvidia's lead, whether it be performance or feature sets, doing the bare minimum to catch up. They need to be much more proactive and not just price-adjust their stuff to whatever Nvidia is doing. The RDNA 1 release was meh, didn't try. RDNA 2 was better. RDNA 3 is looking just okay but still behind where it counts, because they wanted to make a reasonable GPU for under $1000, knowing they still can't compete with Nvidia in RT anyway. Man, they just needed to make a big-ass GPU and screw the noise. Cut the RAM if it's using too much power and go balls to the wall.

True enough, but the only way it gets better is companies accepting lower profit margins than previous gens, since manufacturing costs will only increase further. The 5090 likely needs to launch at over $2000, and maybe even over $2400 if it comes on 2-3 nm nodes, to keep the margins up.

Would AMD do it when the market is as static as it is even when they offer the best value? I doubt it. Which means Nvidia also won't. Only Intel has a shot of being a disruptive force in the market, but will they?

I think it depends on how well these products sell. If the 4000 series ends up being another Turing situation, and if TSMC's prices in general are too high, then it will eventually turn into a domino effect. Not enough product sales means less wafer reservation at TSMC. Less wafer reservation means at some point TSMC might look at this and bring back their volume discounts, because unused wafer capacity is bad for them. Fewer sales also mean the shortages will loosen up, which also means bigger discounts.

This is why I think the 4090 is a bad example for judging price trends: no matter the generation, there will always be people buying top-end hardware regardless of price. Imo people should judge trends based on the rest of the stack. If we look at the 4080 launch, while it's technically sold out in the US, there's still plenty of stock in Canada and many other regions, whereas the 4090 sold out within minutes. I think after the initial sales the 4080 will flop and eventually get discounted, and hopefully the 5000 series will return to Ampere pricing trends.

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850