LegitHyperbole said:
Qwark said:
Depends on what the quality mode offers, I guess. If it's slightly less stable but makes the image look noticeably better, I would go for it. Although this is more akin to Stellar Blade's Performance mode and the Performance RT mode in Ratchet & Clank: Rift Apart than a dedicated quality mode.
If it drops to 30 just for a few extra pixels, I would say quality all the way. I can live with a few drops, though, for a Performance RT mode, since it makes the game look a lot better.
|
But does the game not feel worse for you to play? Like the camera doesn't move smoothly, or characters look like they're jittering, or, God forbid, do you not notice the trailing effect from the choppiness?
|
Honestly, no. I think people underestimate how quickly the mind adapts to a stable FPS, but it depends on the execution. For example, light motion blur really solves camera choppiness, per-character motion blur smooths out character movement, and good frame pacing is what makes a 30fps mode feel good.
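To illustrate what good frame pacing means in practice: a paced 30fps mode doesn't just average 30 frames per second, it presents each frame on an even ~33.3ms cadence. A minimal sketch of that idea (deadline-based pacing, where leftover budget is slept away so fast frames don't arrive early), with `pace_frames` and `frame_work` being made-up names for illustration:

```python
import time

TARGET_FRAME_TIME = 1 / 30  # ~33.3 ms budget per frame for a 30fps mode

def pace_frames(frame_work, n_frames):
    """Run n_frames of frame_work, sleeping out the remainder of each
    33.3 ms budget so frames are delivered at even intervals."""
    timestamps = []
    next_deadline = time.monotonic()
    for _ in range(n_frames):
        frame_work()  # stand-in for simulating/rendering one frame
        next_deadline += TARGET_FRAME_TIME
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)  # wait out the rest of the frame budget
        timestamps.append(time.monotonic())
    return timestamps

# Example: even a fast 5 ms workload gets presented every ~33 ms,
# which is what keeps 30fps feeling consistent rather than choppy.
stamps = pace_frames(lambda: time.sleep(0.005), 10)
deltas = [b - a for a, b in zip(stamps, stamps[1:])]
```

Without the sleep, those frames would arrive every ~5ms and any slow frame would register as a visible hitch; pacing trades raw speed for consistency.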
What I think is crazy is that there are people who blindly pick 60fps regardless of the experience. I watched a playthrough of FFXVI bosses where the frame rate fluctuated between 30 and 50fps the whole time, and it was one of the most nauseating experiences. Or games where the low resolution causes constant FSR artefacting (Jedi: Survivor).
The truth with Cerny's statistics is that they are likely based on Sony first party, where the 60fps modes are actually rock solid. But also on player ignorance... Most gamers I've visited do not know when their TV has the God-awful "TruMotion/super motion" frame interpolation software active, where the software invents frames. I think their experience is informed more by how the modes are labelled than by anything else.
This is why I'm all in favour of developers just building the experience they imagine, with maximum optimisation.
Last edited by Otter - on 05 January 2025