LegitHyperbole said:
Otter said:

I think we're seeing a few things at play.

1. Developers went into this generation with 30fps in mind.

It's clear that in many cases they built the game around 30fps, and that's where the quality is; only late in development do they throw in the 60fps mode to avoid online backlash. These 60fps modes are oftentimes not well tested or optimised, hence the low internal resolutions + bad performance. My honest opinion is that fewer games should have performance modes. They were a cross-gen thing where resources were abundant, but as a whole they're now creating fractured experiences for console gamers, where toggling between the two makes each respective option feel worse, and typically one option is not well optimised but the gamer has already been sold the promise of a 60fps mode. People are overtly aware of what they're missing, and unlike on PC you can't simply adapt the experience at a settings level to get your desired look/feel, nor can you just set your eyes on a GPU upgrade.

Past generations, gamers have always jumped between 30/60fps without much problem. I suspect online toxicity is one driving force (i.e. there was no Twitter/DF to bring everyone's attention to the fact that Ratchet went from 60fps on PS2 to 30fps on PS3, or to start an uproar about how lazy the developers were or how it shows the PS3's weakness, etc.), but also those past jumps were between different packaged games, not within one game with a split-second toggle. Going from playing Mario Kart to Zelda, your brain refreshes its expectations of how it sees the moving image. But switching in real time in the same game requires at least a few minutes of gameplay to readjust. In some cases, though, we had the same game, like Uncharted 4 single player (30fps) vs (60fps), and still it seemed gamers didn't complain about the 30... so that is curious.

2. I think huge resources are being wasted on ray tracing features. It's no surprise that the best-looking game of this generation on consoles (Horizon) forgoes ray tracing entirely to just focus on real-world detail, FX and image quality. Watching DF compare the ray tracing in and out of Star Wars, in some scenes it makes a nice difference, but in many it just looks either the same or simply like different lighting, not obviously "better" lighting. Huge resources are being put towards adhering to realism the average brain takes for granted, and should instead be dedicated to details which do objectively stand out, like world detail, image quality, density or other simulations. Another case in point: TLOU2 still looks better than most 3rd party PS5 games.

3. All Pro games need the option to toggle PSSR off and fall back on FSR or another solution, the same way that PC games can toggle DLSS. Or better yet, a Pro mode vs regular mode, where the game runs the regular PS5 code but with the ~40% GPU boost. Surely that'd be enough to get a nice boost from a dynamic-res game, with none of the faults of a poorly implemented PSSR solution. The true fault is developers not doing QA properly, because who on earth would look at those PSSR artifacts in Jedi and say, cool, let's roll out that patch?? Sony is probably keen on developers not being too ambitious and focusing on their tagline (60fps at quality mode). Developers trying to add in new features on top, and thus sacrificing the resolution bump we should be seeing in the performance modes, seems to be a big problem.
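The dynamic-res point can be sketched with a toy resolution scaler. Everything here is an assumption for illustration: the 60fps budget, the step size, the quadratic pixel-cost model, and the ~40% GPU figure from the post; real engines use far more sophisticated controllers.

```python
FRAME_BUDGET_MS = 1000.0 / 60  # 60 fps target

def scale_resolution(scale, gpu_ms, lo=0.5, hi=1.0, step=0.05):
    """Toy dynamic-res controller: step the render scale down when a
    frame goes over budget, back up when comfortably under it."""
    if gpu_ms > FRAME_BUDGET_MS:
        return max(lo, scale - step)
    if gpu_ms < FRAME_BUDGET_MS * 0.9:
        return min(hi, scale + step)
    return scale

def settle(full_res_cost_ms, frames=50):
    """Run the controller until it settles, assuming GPU cost scales
    roughly with pixel count (scale squared) -- a simplification."""
    scale = 1.0
    for _ in range(frames):
        scale = scale_resolution(scale, full_res_cost_ms * scale ** 2)
    return scale

base = settle(20.0)        # scene costing 20 ms at full res on base PS5
pro = settle(20.0 / 1.4)   # same scene on a hypothetical ~40% faster GPU
print(round(base, 2), round(pro, 2))  # base settles lower, Pro holds 1.0
```

Under this toy model the faster GPU simply holds full resolution where the base console settles around 90% scale, which is the kind of "free" boost from running unmodified code on stronger hardware that the post is describing.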

I'm a fan of having the choices. I particularly like Stellar Blade's solution, where they offer a balanced mode between the two, and it works perfectly for me. I agree though, RT needs to be ignored until we have hardware that can actually handle it or do something significant with it. It's pitiful on base PS5, and while better on Pro it's still too much of a cost. Lazy devs think they can get away with this instead of working on the game's graphics.

It works in Spider-Man 2 though. The default RT options, that is. It managed a locked 30fps on base PS5 and now a locked 60fps on PS5 Pro. It's just that if you go for fidelity mode on Pro, it throws in more, hardly noticeable RT improvements which, without a VRR display, create terrible fps dips while swinging around.

That has also become a problem, relying on VRR. It's great for higher frame rates over 60 fps to maintain perceived stability, but it's a bad crutch for games that can't reach 60 fps. Big swings under 60 fps remain very distracting and don't help image reconstruction techniques.
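Some back-of-the-envelope frame timing shows why drops just under 60 feel so harsh. This is a simplified model (the function names and the 48–120 Hz window are assumptions; real displays and low-framerate compensation behave in more nuanced ways): without VRR a late frame is held to the next fixed refresh, while with VRR the display refreshes when the frame is ready, as long as it lands inside the panel's range.

```python
import math

def present_ms_fixed(render_ms, refresh_hz=60):
    """No VRR: a frame is held until the next vsync of a fixed-rate display."""
    interval = 1000.0 / refresh_hz
    return math.ceil(render_ms / interval) * interval

def present_ms_vrr(render_ms, vrr_hz=(48, 120)):
    """VRR: the display refreshes when the frame is ready, clamped to its range."""
    lo, hi = 1000.0 / vrr_hz[1], 1000.0 / vrr_hz[0]
    return min(max(render_ms, lo), hi)

# A game averaging ~55 fps renders frames in ~18.2 ms:
print(present_ms_fixed(18.2))  # held to ~33.3 ms, an effective 30 fps hitch
print(present_ms_vrr(18.2))    # shown at 18.2 ms, a smooth ~55 fps cadence
```

Even a 59fps frame (17 ms) eats the same ~33 ms penalty on a fixed 60 Hz display, while anything inside the VRR window is presented at its actual cadence; big swings below the window, though, still judder either way, which is the crutch problem above.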


Horizon looks amazing, yet when running around water you notice it's still limited by screen-space reflections: at the edges the reflections are missing or look weird. You only see it while running along water, so it's not all that distracting. Certainly not like huge frame drops or upscaling artifacts.
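The edge artifact has a simple cause: screen-space reflections can only reflect pixels that are already on screen. A minimal 1D sketch of the ray march (entirely illustrative; the names and buffer layout are made up) shows where the data runs out:

```python
def ssr_trace(start_x, dir_x, depth_buffer, max_steps=32):
    """March a reflection ray across a 1D 'screen'. Returns the index of
    the geometry it hits, or None once the ray leaves the screen --
    there is simply no colour data to reflect beyond the frame."""
    x = start_x
    for _ in range(max_steps):
        x += dir_x
        if not (0 <= x < len(depth_buffer)):
            return None  # ray exited the viewport: missing reflection
        if depth_buffer[int(x)] is not None:
            return int(x)  # hit something captured in the frame
    return None  # gave up within the frame

screen = [None] * 10
screen[7] = 1.0  # some geometry near the right edge
print(ssr_trace(4, 1, screen))   # 7: reflection resolves
print(ssr_trace(4, -1, screen))  # None: ray walks off the left edge
```

Engines typically mask the misses by fading reflections near the screen edges or falling back to something like a cubemap, which is the "missing or weird" look at the edges of the water.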

I'm sure RT can put more 'realism' in scenes like this

Flood it with shadows and ray-traced lighting. Yet for now it's not worth the performance cost. RT is great when you can switch to it and skip all the lighting work, shadow and reflection maps. But right now it's just an extra: no time saver, more work. And thus not the priority for optimisation.

RT might make this look better too, or worse

Running at a locked 60 FPS makes it look more impressive than adding RT reflections at 30 fps, imo.

It looks great without RT, not missing it.


Just like VR, RT is still in its experimental stage. Too early to switch to; it can add depth, but mostly it adds performance woes.