JRPGfan said:
Captain_Yuri said:

The question is of course how much better, and which card, because the 3080 is 25-40% faster than the 2080 Ti with RT enabled, depending on the game.

These are cherry-picked benchmarks, so I will wait for the actual reviews. The 1440p benchmarks are actually closer than I was expecting, as there are 4/10 titles where AMD has a huge edge against the 3080.

Yeah, it is pretty crazy that after what, 3-4 generations of GPUs, we finally have an AMD GPU that can go toe to toe in raster. There's no way to put it other than amazing.

Then the question is which games will support AMD's version of DLSS. Not to mention, how's the quality of the image?

There are only 12 games so far (afaik) that make use of Nvidia DLSS 2.0 (the older versions look worse than simple image scaling plus a sharpening filter).
Going forward, I suspect DLSS-style techniques shouldn't be a reason to buy or skip a card.

That 3DMark benchmark with ray tracing showed a 22% lead for the 3080 FE over the 6800 XT (which should shrink with driver optimisations going forward). That's a heavy ray tracing load, more than most games will use. So even in ray tracing situations, the advantage for the Nvidia cards won't be that big.

Right... And there's not much reason to think AMD's version won't be as bad as DLSS 1.0, since they are starting from scratch... Who knows how long it will take AMD to field a competitor to DLSS, let alone one that looks as good as 2.0, while Nvidia keeps improving DLSS... Especially since they did not mention a dedicated hardware accelerator similar to Nvidia's Tensor Cores, unless I missed it.

If the benchmark you are referring to is the one where the 72 CU card performs like a 2080 Ti when RT is enabled, that does not bode well for the rest of the 6000 series' RT performance. It means the 2080 Ti's/3070's raster competitor would perform like a 2070 Super with RT enabled, and so on. On that showing, AMD would have an overall worse ray tracing implementation than Turing.
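The scaling argument above can be sketched with some back-of-the-envelope ratios. All index numbers here are illustrative assumptions for the sake of the arithmetic, not measured benchmark results:

```python
# Rough sketch of the RT-penalty scaling argument (all numbers are
# made-up relative performance indices, not real benchmark data).
raster_index = {"6800 XT": 100, "3080": 100, "2080 Ti": 80, "2070 Super": 64}

# Leaked result: the 72 CU card's RT throughput lands at 2080 Ti level,
# even though its raster throughput matches the 3080.
rt_6800xt = raster_index["2080 Ti"]                 # 80 on this scale
rt_penalty = rt_6800xt / raster_index["6800 XT"]    # 0.8x of its raster tier

# Apply the same penalty to a smaller card that matches the 2080 Ti
# in raster: it falls to roughly 2070 Super territory with RT on.
rt_smaller_card = raster_index["2080 Ti"] * rt_penalty
print(rt_smaller_card)  # 64.0, i.e. 2070 Super tier in this toy model
```

The point is only that if the RT penalty is a roughly constant multiplier across the lineup, every 6000-series card drops about one raster tier once RT is enabled.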

Considering Watch Dogs Legion needs a 3090 plus DLSS performance mode to run 4K 60 fps with ray tracing on Ultra, according to the leaked videos, I hope that benchmark isn't legit.


PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850