JEMC said:
Just because Nvidia has decided to increase the power consumption of their cards, that doesn't mean AMD followed the same route. It wouldn't be a complete surprise if AMD saw the 320 or so watts the 6900XT used and decided that it was already too much... I know, I know, it's highly unlikely given the rumors we've had so far. But if the 7900XTX will use more than 375W, then it's weird that AMD went with two different designs. Also, AIB cards always cost more than reference ones, both for AMD and for Nvidia, and if Nvidia's cooler happens to be better and more expensive than AMD's reference one, then the custom cards will reflect that extra cost.
Yeah, but it feels like they are going to be sacrificing features to achieve that. Nvidia GPUs don't use more power or have a bigger die for no reason: Tensor cores and RT cores take up a good chunk of the die area and still draw power even when you aren't using those features. It's one of the reasons the 6000 series was very power efficient compared to Ampere. RDNA 2 focused a lot on standard shader cores but basically left out dedicated AI accelerators and more performant RT cores. Using GDDR6 and TSMC instead of GDDR6X and Samsung helped greatly too, but now it's TSMC vs. TSMC.
So it could be another case of competitive raster performance that falls behind in a lot of other areas, which would be lame, but we'll see this Thursday.
PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850