HoloDust said:
sc94597 said:

Everything I have read and experienced on the topic points to a difference of a few % in either direction, depending on the game's denoisers. 

Sometimes there is a performance uplift because multiple shader-based denoisers that consume CUDA-core resources are replaced by the RR model that utilizes tensor cores. Other times there is a penalty if the denoiser it replaces was already lightweight. 

Cyberpunk 2077 used to require DLSS SR to be active to use RR, so I can see there being a heavy performance penalty there. But they updated that IIRC?

Haven't really tested this myself, so I'm pulling data from the net, but it seems that RTX 40 & 50 see only those few %. Ampere gets hit much harder.

Looking into it, the hit is especially pronounced in the new DLSS 4 implementation. 

The old RR model seemed to have about a 6% performance penalty on an RTX 3080 in CP2077 and an 11% penalty on a 3060. 

About 12% for a 3050.

So yeah, more significant than I remembered, even with the old model. 

The S2 has about 60% of the tensor cores of the desktop 3050, so it's probably not doable unless they could run a lightweight/distilled model.
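
To put that in frame-time terms, here's a rough back-of-envelope sketch in Python. All of the inputs are assumptions on my part (a 60 fps baseline, and the idea that RR's cost scales inversely with tensor-core throughput, ignoring clocks, bandwidth, and model differences), so treat it as napkin math, not a benchmark:

```python
# Napkin math: convert an observed % fps penalty from RR into a per-frame
# cost in ms, then scale it for a part with fewer tensor cores.
# Assumptions (mine, not measured): 60 fps baseline, cost scales inversely
# with tensor-core count, everything else held equal.

def rr_cost_ms(base_fps: float, penalty_pct: float) -> float:
    """Frame-time added by RR, derived from an observed % fps penalty."""
    base_ms = 1000.0 / base_fps
    ms_with_rr = 1000.0 / (base_fps * (1.0 - penalty_pct / 100.0))
    return ms_with_rr - base_ms

cost_3050 = rr_cost_ms(base_fps=60.0, penalty_pct=12.0)  # ~12% hit on a desktop 3050

tensor_ratio = 0.6   # hypothetical S2-class part with ~60% of the 3050's tensor cores
cost_s2 = cost_3050 / tensor_ratio

print(f"RR cost on 3050:    ~{cost_3050:.2f} ms/frame")
print(f"Scaled S2 estimate: ~{cost_s2:.2f} ms/frame")
```

With those assumptions it comes out to roughly 2.3 ms per frame on the 3050 and closer to 3.8 ms on the S2-class part, which would be a big slice of a 16.7 ms budget at 60 fps. Hence the thought that only a lighter/distilled model would fit.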