haxxiy said:
TPU managed to hit these frequencies with two out of two cards (the XFX and the TUF), while Guru3D's TUF performed worse than TPU's reference card (much lower memory clocks and just 20 MHz more on the GPU clock). Could be a bad sample, or could be because G3D is using Afterburner instead. Buggy software wouldn't be a surprise here. As for ray tracing, I'm thinking more of the average rather than the worst-case scenario (Steve from GN was told by AMD that Cyberpunk 2077's RT is bugged on Radeon). I mean, it's fair to take that into account since it's RTG's fault, but other games exist, too.
They did, but the point is that even at 3 GHz it won't necessarily get near a 4090 when measured across other games, as they all scale differently with RDNA 3. A 200 MHz memory OC rarely makes a difference between games.
You also can't cherry-pick the best-case scenario and assume every other game will scale the same way. Remember that AMD said in their press event that Cyberpunk was one of the games with a 1.7x raster improvement while the rest were around 1.5x, so if anything, that one is the outlier. You can also pick other RT games that actually do have proper ray tracing, like Control, which shows roughly 3090 Ti-tier RT performance if we assume the best-case scenario:
The other issue with the relative chart is that it includes games like Far Cry 6, which isn't actually all that RT-heavy, so that favors AMD. Claiming 4080-level RT while including games with light RT workloads would be very misleading, because if a person buys a 7900 XTX thinking that OCing will deliver 4080-level RT and then plays actual RT-heavy games, it won't perform as expected.
Just to be clear, I am not saying the 7900 XTX can't perform the way TechPowerUp showed; I am saying that we need to wait for more evidence before assuming it can across the board.