
I think we're seeing a few things at play.

1. Developers did not go into this generation with 60fps in mind.

It's clear that in many cases they built the game around 30fps, and that's where the quality is; only late in development do they throw in a 60fps mode to avoid online backlash. These 60fps modes are often not well tested or optimised, hence the low internal resolutions and bad performance. My honest opinion is that fewer games should have performance modes. They were a cross-gen thing where resources were abundant, but as a whole they're now creating fractured experiences for console gamers, where toggling between the two makes each option feel worse, and typically one option is not well optimised even though the gamer has already been sold the promise of a 60fps mode. People are overtly aware of what they're missing, and unlike on PC you can't simply adapt the experience at a settings level to get your desired look/feel, nor can you just set your eyes on a GPU upgrade.

In past generations gamers always jumped between 30/60fps without much problem. I suspect online toxicity is one driving force (i.e. there was no Twitter/DF to bring everyone's attention to the fact that Ratchet went from 60fps on PS2 to 30fps on PS3, or to start an uproar about how lazy the developers were or how it shows the PS3's weakness, etc.), but also those past jumps happened between separately packaged games, not within one game with a split-second toggle. Going from playing Mario Kart to Zelda, your brain refreshes its expectations of how it sees the moving image, but switching in real time within the same game requires at least a few minutes of gameplay to readjust. In some cases we did have a split within the same game, like Uncharted 4 single player (30fps) vs multiplayer (60fps), but gamers still didn't seem to complain about the 30... so that is curious.

2. I think huge resources are being wasted on ray tracing features. It's no surprise that the best-looking game of this generation on consoles (Horizon) forgoes ray tracing entirely to focus on real-world detail, FX and image quality. Watching DF compare the ray tracing on and off in Star Wars, in some scenes it makes a nice difference, but in many it looks either the same or just like different lighting, not obviously "better" lighting. Huge resources are being put towards a level of realism the average brain takes for granted, and should instead be dedicated to details which do objectively stand out, like world detail, image quality, density or other simulations. Another case in point: TLOU2 still looks better than most 3rd party PS5 games.

3. All Pro games need the option to toggle PSSR off and fall back on FSR or another solution, the same way PC games can toggle DLSS. Or better yet, a Pro mode vs a regular mode, where the game runs the regular PS5 code but with the 40% GPU boost. Surely that'd be enough to get a nice boost out of a dynamic-res game, with none of the faults of a poorly implemented PSSR solution. The real fault is developers not doing QA properly, because who on earth would look at those PSSR artifacts in Jedi and say, cool, let's roll out that patch?? Sony is probably keen on developers not being too ambitious and focusing on their tagline (quality mode at 60fps). Developers trying to add new features on top, and thus sacrificing the resolution bump we should be seeing in performance modes, seems to be a big problem.

Last edited by Otter - 2 days ago