HoloDust said:
Chrkeller said:

My experience aligns with yours.  I think DLSS works really well with a 720p internal render (or better).  It loses quality quickly when the input gets super low, like upscaling 540p to 1080p.

I'll add that I absolutely cannot tell the difference between 4K DLSS Quality and native 4K.  I just cannot.  It is a great benefit to gaming to get a 4K image without wasting the resources.  Though to be fair, IMO, the difference between 4K and 1440p is negligible on a 55-inch screen.  Perhaps on massive 75-inch screens there is a difference, but 1440p (for most people) seems more than enough.

Personally, I think the resolution wars need to stop, and developers should focus on fps.  Given a choice between 4K at 60 fps or 1440p at 120 fps...  the latter all day long.

I remember there were some tests (subjective, of course) where in many titles folks preferred 4K DLSS Quality over 4K native. So yeah, wasting GPU/CPU resources on 4K native is silly with such good upscaler tech. I think we'll see a lot more of what Ubi did for Star Wars Outlaws, putting an upscaler into a game's official system recommendations.

If I were going to be super picky, the only issue I have ever seen with DLSS 1440p to 4K was in The Last of Us Part I.  Metal railing in the distance shimmered a bit.  But that is a very poor port, and I'm being super picky.  I mean, who cares if a railing 300 ft away has a slight shimmer when DLSS took me from 60-80 fps to 110-120 fps?

No issues with any other game.  In fact, quite often, I run DLSS 1440p to 4K even if my rig can do native.  It keeps temps way down and the rig runs super quiet.
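
For anyone curious about the numbers behind these modes, here's a minimal sketch (my own illustration, not anything from NVIDIA's SDK) using the commonly cited per-axis scale factors for the DLSS presets. It shows why "4K DLSS Quality" and "1440p DLSS to 4K" describe roughly the same internal render load.

```python
# Approximate DLSS internal render resolutions, assuming the commonly
# cited per-axis scale factors for each preset (illustration only).
SCALE_FACTORS = {
    "Quality": 2 / 3,             # ~66.7% per axis
    "Balanced": 0.58,             # ~58% per axis
    "Performance": 0.50,          # 50% per axis
    "Ultra Performance": 1 / 3,   # ~33.3% per axis
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    s = SCALE_FACTORS[mode]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    for mode in SCALE_FACTORS:
        w, h = internal_resolution(3840, 2160, mode)
        print(f"4K output, {mode}: renders at ~{w}x{h}")
    # 4K Quality works out to ~2560x1440, i.e. a 1440p internal render
    # upscaled to 4K, which is the scenario discussed above.
```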



i7-13700k

Vengeance 32 GB

RTX 4090 Ventus 3x E OC

Switch OLED