curl-6 said:
Chrkeller said:

Yeah, but that isn't really my point. RT settings on consoles are low, and at low settings the quality difference between baked-in lighting and RT is negligible. RE4 is an example of this: RT on or off is a minor difference at best. RT shows a stark difference when set high, which consoles can't handle, especially the S2. We are a generation away from seeing the real benefits of RT. Low RT versus off makes no difference in most games; half the time I pause the screen and have to hunt to even notice what low RT does. Extreme RT is gorgeous, but it also drops my 4090 to 20 fps.

If we're talking perception, one could argue that, say, going from 1080p on PS4 to 4K on PS5 isn't a hugely perceptible difference due to diminishing returns, but there's still a major technical difference between the two. The same goes for real-time vs. baked shadows, volumetrics vs. alpha effects for things like smoke/fog, etc.

Personally, I find the difference between even console RT and 8th-gen rasterized lighting can be pretty significant in games like, say, Metro Exodus.

Preaching to the choir. Resolution is overrated; I can't tell the difference between 1440p and 4K, so I tend to go 1440p at 100 fps over 4K at 60 fps. Resolution hits diminishing returns; fps doesn't. 120 fps is bliss.

Perhaps the reason we aren't aligned is that I'm all about fps. 30 fps can go **** itself.
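For concreteness, here's the raw arithmetic behind the resolution-versus-fps tradeoff, as a quick back-of-the-envelope sketch (Python, purely illustrative):

```python
# Pixel counts: why 1440p -> 4K is a big technical jump
# even if it's a small perceptual one at typical viewing distances.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} megapixels")
# 1080p: 2.07 Mpx, 1440p: 3.69 Mpx, 4K: 8.29 Mpx
# 4K pushes ~2.25x the pixels of 1440p every frame.

# Frame-time budget: what each fps target means per frame.
for fps in (30, 60, 100, 120):
    print(f"{fps} fps = {1000 / fps:.1f} ms per frame")
# 30 fps = 33.3 ms, 60 fps = 16.7 ms, 100 fps = 10.0 ms, 120 fps = 8.3 ms
```

So dropping from 4K to 1440p frees up well over half the per-frame pixel work, which is where the headroom for 100+ fps comes from.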

Last edited by Chrkeller - on 14 August 2025

i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED