JEMC said:
Jizz_Beard_thePirate said:

This is kind of why it's hard for high-end GPU buyers to consider Radeon over Nvidia. The 9070 XT falling behind a 4070 at 1440p with path tracing means they still have some work to do architecturally. This is also why I like Radeon targeting the mid-range: at that price point, compromises in feature set, like in heavy RT titles, matter a lot less than VRAM, where Nvidia is only giving you 12 GB versus 16 GB on Radeon.

But at the $1,000 price point or higher, it's a lot to spend on a GPU that has compromises in features like RT, because at that price you should be buying a GPU that's good at everything. At least they've caught up in upscaling, so maybe things will improve with UDNA.

We'll have to see what happens when Microsoft's DXR 1.2 rolls out and AMD and Intel enable it, if they can, in their GPUs. From what we know, Nvidia has already been using opacity micromaps for a couple of years, giving them a performance advantage over the competition, and they stand to benefit further from shader execution reordering.

But that will take time, maybe long enough that it will only come into effect with the successors of the cards launching now.

Are we actually expecting much from the new DXR features for RT, or from neural shading to lower VRAM usage, or is it just something that sounds great but never really materializes?