Biggerboat1 said:
It's difficult to find specific benchmarks for the RTX 3050 laptop GPU that show 4K DLSS performance vs 1440p native (I guess it's not the sexiest GPU, so it didn't attract a lot of coverage). Generally, though, the consensus seems to be that DLSS is highly recommended for this GPU precisely because of its underwhelming specs... Here are Cyberpunk benchmarks using DLSS on an RTX 3050 (the desktop version, so obviously a bit more powerful). They show a significant increase in frame rates when using DLSS no matter the resolution.

I'm admittedly no expert, but in what scenario would it not be beneficial to use DLSS? In its recent iterations there's negligible visual degradation/artifacting unless the gulf between native and output resolution is big. For MKW, even forgetting 4K DLSS Performance, why would it not be better to run the game at 1440p DLSS Balanced (~835p internal) than native 1440p, then free up headroom for upping environmental density or whatever? Genuine question.

If I remember correctly, TOTK used FSR on S1... If S1 can handle upscaling, surely S2 should find it a breeze. Can you show an example where any GPU has worse performance when using DLSS vs native? (Again, a genuine question; I'm trying to figure all of this stuff out.) Could it be Nintendo just being Nintendo and inexplicably ignoring the feature, just like AA?
Zelda TOTK and Splatoon 3 use FSR 1.0
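
For anyone trying to reason about the mode names above: each DLSS quality mode implies a fixed internal render resolution that the GPU actually draws before upscaling. A minimal sketch of that arithmetic, assuming NVIDIA's commonly documented per-axis scale factors (Quality ≈ 66.7%, Balanced ≈ 58%, Performance = 50%; individual titles can deviate):

```python
# Approximate DLSS per-axis render-scale factors by quality mode.
# These are the commonly documented defaults; games may override them.
DLSS_SCALE = {
    "Quality": 1 / 1.5,            # ~66.7% per axis
    "Balanced": 1 / 1.724,         # ~58% per axis
    "Performance": 1 / 2.0,        # 50% per axis
    "Ultra Performance": 1 / 3.0,  # ~33.3% per axis
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the internal resolution DLSS renders at before upscaling."""
    scale = DLSS_SCALE[mode]
    return round(width * scale), round(height * scale)

# 4K Performance renders internally at 1080p:
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
# 1440p Balanced renders internally at roughly 835p, not 1080p:
print(internal_resolution(2560, 1440, "Balanced"))     # (1485, 835)
```

This is why "4K DLSS Performance" and "native 1440p" are closer in GPU cost than the output numbers suggest: the former is shading fewer pixels (1920×1080) than the latter (2560×1440), with the upscaler making up the difference.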







