JEMC said:

So, if the synthetic benchmark results translate to gaming, Nvidia's new cards will bring quite a nice jump all across the board. Let's hope it's true... and let's see at what cost (in both $ and W) they achieve those results.

Meanwhile, we now know that AMD's high-end card will draw 525W at most. A ridiculous amount. Let's hope the performance jump is worth it.
At least we know it will come with 24GB of VRAM.

And thank God the Ethereum nightmare is over! (Well, almost.) Let's hope mining stays quiet for a while and lets us buy GPUs at retail prices, not speculative prices.

I think we now know the 7900 XT will draw at least 375 watts, most likely 400 watts. The question is whether AMD will do their usual sneaky TDP numbers or go the Nvidia route and provide a more honest figure. As we know, Nvidia rates its TDP based on the entire card, while AMD's number covers only the GPU die and VRAM. If they're more honest about it, I think we could see the 7900 XT at 450 watts. But the real question is where their feature set stands.
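To put rough numbers on that TDP accounting difference, here's a back-of-the-envelope sketch. The efficiency and overhead figures are my own illustrative assumptions, not measured values:

```python
# Rough sketch of why a "GPU die + VRAM only" TDP understates
# whole-card power. All overhead figures below are assumptions
# for illustration, not measured data.

gpu_and_vram_w = 375      # AMD-style TDP: die + memory only
vrm_efficiency = 0.90     # assumed ~90% efficient power delivery
misc_overhead_w = 15      # assumed fans, PCB losses, display I/O

# Nvidia-style board power counts everything on the card:
board_power_w = gpu_and_vram_w / vrm_efficiency + misc_overhead_w
print(f"Estimated whole-card power: {board_power_w:.0f} W")  # ~432 W
```

Under those assumptions, a 375 W "AMD-style" figure and a ~450 W "whole-card" figure can describe roughly the same product.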

Will RDNA 3 have an actual DLSS competitor? Will it be competitive in ray tracing? Will there be an answer to Reflex? How about NVENC and OptiX? I think at the very least, they need to be competitive in ray tracing as well as raster, but I get the feeling that when push comes to shove, they'll be slightly more efficient while remaining uncompetitive in most areas outside of raster, all while charging Nvidia prices.

The reality is that the only way Radeon can field a card that's competitive with Nvidia on every front is to build one that's as inefficient as Nvidia's, if not more so. But we will see.

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850