haxxiy said:
Captain_Yuri said:

I wouldn't be too hyped about the OC numbers until we see more. TechPowerUp needed to push clocks to 3.2 GHz with aggressive overclocking, undervolting, and tuning to get near stock 4090 levels in Cyberpunk raster, and we have no idea how much power that required. Most review outlets only managed to OC to 3 GHz with the TUF, and the performance ended up only about 5% higher on average, which you can easily match with a 4080 as well:

https://www.guru3d.com/articles_pages/asus_tuf_gaming_radeon_rx_7900_xtx_oc_review,30.html

Not to mention, 4080-level ray tracing would still be far out of reach even with an OC:

At best, it will likely land at 3090 Ti levels.

So it's better to wait and see before claiming it will be near 4090 raster or 4080 RT levels, but the clocks certainly should have been higher than what the reference card shipped with.

TPU managed to hit those frequencies with two out of two cards, though (the XFX and the TUF), while Guru3D's TUF performed worse than TPU's reference card (much lower memory clocks and only 20 MHz more on the GPU clock).

Could be a bad sample, or it could be because Guru3D is using Afterburner instead. Another piece of bugged software wouldn't be a surprise here.

As for ray tracing, I'm thinking more of the average than the worst-case scenario (Steve from GN was told by AMD that Cyberpunk 2077's RT is bugged on Radeon). It's fair to take that into account since it's RTG's fault, but other games exist, too.

They did, but the point is that even at 3 GHz it doesn't mean the card will get near a 4090 when measured across other games, since they all scale differently with RDNA 3. A 200 MHz memory OC rarely makes a difference between games.

You also can't cherry-pick the best-case scenario and assume every other game will scale linearly. Remember that AMD said in their press event that Cyberpunk was one of the games with a 1.7x raster improvement while the rest were at 1.5x, so if anything, that's the outlier. You can also pick RT games that actually have proper ray tracing, like Control, which shows similar 3090 Ti-tier RT performance even in the best-case scenario:

The other issue with the relative chart is that it includes games like Far Cry 6, which isn't actually RT-heavy, and that favors AMD. Claiming 4080-level RT while including games with light RT workloads would be very misleading, because if someone buys a 7900 XTX thinking an OC will deliver 4080-level RT and then plays actual RT-heavy games, it won't perform as expected.

Just to be clear, I'm not saying the 7900 XTX can't perform the way TechPowerUp showed; I'm saying we need to wait for more evidence before assuming it can across the board.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850