JRPGfan said:
There are only 12 games so far (afaik) that make use of Nvidia DLSS 2.0 (the older versions look worse than simple image upscaling plus a sharpening filter).
Right... And there's not much reason to think AMD's version won't be as bad as DLSS 1.0, since they're starting from scratch. Who knows how long it will take AMD to field a competitor to DLSS, let alone one that looks as good as 2.0, while Nvidia keeps improving DLSS. Especially since they didn't mention any dedicated hardware accelerator similar to Nvidia's Tensor Cores, unless I missed it.
If the benchmark you're referring to is the one where the 72CU card performs like a 2080 Ti with RT enabled, that doesn't bode well for RT performance across the rest of the 6000 series. It would mean the card that competes with the 2080 Ti/3070 in rasterization drops to 2070 Super levels once RT is enabled, and so on down the stack. If so, AMD's ray tracing implementation would be worse overall than even Turing's.
Considering Watch Dogs Legion needs a 3090 plus DLSS Performance mode to hit 4K 60 fps with ray tracing on Ultra, according to the leaked videos, I hope that benchmark isn't legit.
PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850