
Forums - Sony Discussion - Is the PS5 PRO an utterly useless device at launch? Reviews are out now.

 

PS5 PRO

Worth the money: 24 (34.78%)
Waste of money: 45 (65.22%)
Total: 69

I find myself defending the PS5 Pro often now. PC players actually think they could build a PC with 2TB of storage that does what the PS5 Pro does for that money. 🤣 I don't know if I'm being trolled or these are meat bots on duty for Sony to make the Pro look good; the latter is kinda working, for sure. 




Apparently, a lot of people are reporting issues like this. I don't have hard evidence that it's widespread. It's something to look out for, though.



Bite my shiny metal cockpit!

SvennoJ said:

Sub 1080p base resolution shouldn't happen in 2024...

Alan Wake 2 (847p), Silent Hill 2 (864p), Dragon Age The Veilguard (835p) and Jedi Survivor (864p) all have image instability issues in their Performance modes with PSSR.

Silent Hill 2, a PS2 game, at 864p on PS5 Pro? Wtf. The screenshots don't look all that impressive until you check out the ones with the fog removed. I guess it's not optimized much.

Jedi Survivor is the worst. Unbelievable that they still can't get it together. 



I think we're seeing a few things at play.

1. Developers went into this generation with 30fps in mind.

It's clear that in many cases they built the game around 30fps, and that's where the quality is; only late in development do they throw in the 60fps mode to avoid online backlash. These 60fps modes are often not well tested or optimised, hence the low internal resolutions and bad performance. My honest opinion on performance modes is that fewer games should have them. They were a cross-gen thing where resources were abundant, but as a whole they're now creating fractured experiences for console gamers, where toggling between the two makes each respective option feel worse, and typically one option is not well optimised while the gamer has already been sold the promise of a 60fps mode. People are acutely aware of what they're missing, and unlike on PC you can't simply adapt the experience at a settings level to get your desired look/feel, nor can you just set your eyes on a GPU upgrade.

In past generations gamers have always jumped between 30/60fps without much problem. I suspect online toxicity is one driving force (i.e. there was no Twitter/DF to bring everyone's attention to the fact that Ratchet went from 60fps on PS2 to 30fps on PS3, or start an uproar about how lazy the developers were or how it shows the PS3's weakness, etc.), but also these past jumps were between separately packaged games, not one game with a split-second toggle. I.e. going from playing Mario Kart to Zelda, your brain refreshes its expectations of how it sees the moving image. But switching in real time in the same game requires at least a few minutes of gameplay to readjust. In some cases though we had the same game, like Uncharted 4 single player (30fps) vs multiplayer (60fps), and still it seemed gamers didn't complain about the 30... so that is curious.

2. I think huge resources are being wasted on ray tracing features. It's no surprise that the best looking game of this generation on consoles (Horizon) forgoes ray tracing entirely to just focus on real-world detail, FX and image quality. Watching DF compare the ray tracing on and off in Star Wars, in some scenes it makes a nice difference, but in many it just looks either the same or just like different lighting, not obviously "better" lighting. Huge resources are being put towards adhering to realism the average brain takes for granted, and should instead be dedicated to details which do objectively stand out, like world detail, image quality, density or other simulations. Another case in point: TLOU2 still looks better than most 3rd party PS5 games.

3. All Pro games need to have the option to toggle PSSR off and fall back on FSR or another solution, just the same way that PC games can toggle DLSS. Or better yet, a Pro mode vs regular mode, where the game plays the regular PS5 code but just with the 40%-boosted GPU. Surely that'd be enough to get a nice boost from a dynamic-res game, with none of the faults of a poorly implemented PSSR solution. The true fault is developers not doing QA properly, because who on earth would look at those PSSR artifacts in Jedi and say, cool, let's roll out that patch? Sony is probably keen on developers not being too ambitious and focusing on their tagline (60fps at quality mode). Developers trying to add in new features on top, and thus sacrificing the resolution bump we should be seeing in the performance modes, seems to be a big problem.

Last edited by Otter - 2 days ago

Otter said:

I think we're seeing a few things at play.

1. Developers went into this generation with 30fps in mind.

It's clear that in many cases they built the game around 30fps, and that's where the quality is; only late in development do they throw in the 60fps mode to avoid online backlash. These 60fps modes are often not well tested or optimised, hence the low internal resolutions and bad performance. My honest opinion on performance modes is that fewer games should have them. They were a cross-gen thing where resources were abundant, but as a whole they're now creating fractured experiences for console gamers, where toggling between the two makes each respective option feel worse, and typically one option is not well optimised while the gamer has already been sold the promise of a 60fps mode. People are acutely aware of what they're missing, and unlike on PC you can't simply adapt the experience at a settings level to get your desired look/feel, nor can you just set your eyes on a GPU upgrade.

In past generations gamers have always jumped between 30/60fps without much problem. I suspect online toxicity is one driving force (i.e. there was no Twitter/DF to bring everyone's attention to the fact that Ratchet went from 60fps on PS2 to 30fps on PS3, or start an uproar about how lazy the developers were or how it shows the PS3's weakness, etc.), but also these past jumps were between separately packaged games, not one game with a split-second toggle. I.e. going from playing Mario Kart to Zelda, your brain refreshes its expectations of how it sees the moving image. But switching in real time in the same game requires at least a few minutes of gameplay to readjust. In some cases though we had the same game, like Uncharted 4 single player (30fps) vs multiplayer (60fps), and still it seemed gamers didn't complain about the 30... so that is curious.

2. I think huge resources are being wasted on ray tracing features. It's no surprise that the best looking game of this generation on consoles (Horizon) forgoes ray tracing entirely to just focus on real-world detail, FX and image quality. Watching DF compare the ray tracing on and off in Star Wars, in some scenes it makes a nice difference, but in many it just looks either the same or just like different lighting, not obviously "better" lighting. Huge resources are being put towards adhering to realism the average brain takes for granted, and should instead be dedicated to details which do objectively stand out, like world detail, image quality, density or other simulations. Another case in point: TLOU2 still looks better than most 3rd party PS5 games.

3. All Pro games need to have the option to toggle PSSR off and fall back on FSR or another solution, just the same way that PC games can toggle DLSS. Or better yet, a Pro mode vs regular mode, where the game plays the regular PS5 code but just with the 40%-boosted GPU. Surely that'd be enough to get a nice boost from a dynamic-res game, with none of the faults of a poorly implemented PSSR solution. The true fault is developers not doing QA properly, because who on earth would look at those PSSR artifacts in Jedi and say, cool, let's roll out that patch? Sony is probably keen on developers not being too ambitious and focusing on their tagline (60fps at quality mode). Developers trying to add in new features on top, and thus sacrificing the resolution bump we should be seeing in the performance modes, seems to be a big problem.

I'm a fan of the choices. I particularly like Stellar Blade's solution, where they offer a balanced mode between the two, and it works for me perfectly. I agree though, RT needs to be ignored until we have hardware that can actually handle it or do something significant with it. It's pitiful on base PS5, and while better on Pro, it's still too much of a cost. Lazy devs think they can get away with this instead of working on the game's graphics. 



LegitHyperbole said:
SvennoJ said:

Sub 1080p base resolution shouldn't happen in 2024...

Alan Wake 2 (847p), Silent Hill 2 (864p), Dragon Age The Veilguard (835p) and Jedi Survivor (864p) all have image instability issues in their Performance modes with PSSR.

Silent Hill 2, a PS2 game, at 864p on PS5 Pro? Wtf. The screenshots don't look all that impressive until you check out the ones with the fog removed. I guess it's not optimized much.

Jedi Survivor is the worst. Unbelievable that they still can't get it together. 

It's Star Wars, so who cares anyway.



BraLoD said:
LegitHyperbole said:

Jedi Survivor is the worst. Unbelievable that they still can't get it together. 

It's Star Wars, so who cares anyway.

The first one was rough around the edges but really fun; it had a lot of FromSoft DNA, and I've read this one leans more into that. Really cool boss battles too. Alas, I'll not be playing because of artifacting in performance mode and just general jank. It'll be a PS6 game for me. 



LegitHyperbole said:
Otter said:

I think we're seeing a few things at play.

1. Developers went into this generation with 30fps in mind.

It's clear that in many cases they built the game around 30fps, and that's where the quality is; only late in development do they throw in the 60fps mode to avoid online backlash. These 60fps modes are often not well tested or optimised, hence the low internal resolutions and bad performance. My honest opinion on performance modes is that fewer games should have them. They were a cross-gen thing where resources were abundant, but as a whole they're now creating fractured experiences for console gamers, where toggling between the two makes each respective option feel worse, and typically one option is not well optimised while the gamer has already been sold the promise of a 60fps mode. People are acutely aware of what they're missing, and unlike on PC you can't simply adapt the experience at a settings level to get your desired look/feel, nor can you just set your eyes on a GPU upgrade.

In past generations gamers have always jumped between 30/60fps without much problem. I suspect online toxicity is one driving force (i.e. there was no Twitter/DF to bring everyone's attention to the fact that Ratchet went from 60fps on PS2 to 30fps on PS3, or start an uproar about how lazy the developers were or how it shows the PS3's weakness, etc.), but also these past jumps were between separately packaged games, not one game with a split-second toggle. I.e. going from playing Mario Kart to Zelda, your brain refreshes its expectations of how it sees the moving image. But switching in real time in the same game requires at least a few minutes of gameplay to readjust. In some cases though we had the same game, like Uncharted 4 single player (30fps) vs multiplayer (60fps), and still it seemed gamers didn't complain about the 30... so that is curious.

2. I think huge resources are being wasted on ray tracing features. It's no surprise that the best looking game of this generation on consoles (Horizon) forgoes ray tracing entirely to just focus on real-world detail, FX and image quality. Watching DF compare the ray tracing on and off in Star Wars, in some scenes it makes a nice difference, but in many it just looks either the same or just like different lighting, not obviously "better" lighting. Huge resources are being put towards adhering to realism the average brain takes for granted, and should instead be dedicated to details which do objectively stand out, like world detail, image quality, density or other simulations. Another case in point: TLOU2 still looks better than most 3rd party PS5 games.

3. All Pro games need to have the option to toggle PSSR off and fall back on FSR or another solution, just the same way that PC games can toggle DLSS. Or better yet, a Pro mode vs regular mode, where the game plays the regular PS5 code but just with the 40%-boosted GPU. Surely that'd be enough to get a nice boost from a dynamic-res game, with none of the faults of a poorly implemented PSSR solution. The true fault is developers not doing QA properly, because who on earth would look at those PSSR artifacts in Jedi and say, cool, let's roll out that patch? Sony is probably keen on developers not being too ambitious and focusing on their tagline (60fps at quality mode). Developers trying to add in new features on top, and thus sacrificing the resolution bump we should be seeing in the performance modes, seems to be a big problem.

I'm a fan of the choices. I particularly like Stellar Blade's solution, where they offer a balanced mode between the two, and it works for me perfectly. I agree though, RT needs to be ignored until we have hardware that can actually handle it or do something significant with it. It's pitiful on base PS5, and while better on Pro, it's still too much of a cost. Lazy devs think they can get away with this instead of working on the game's graphics. 

It works in Spiderman 2 though. The default RT options, that is. It managed a locked 30fps on base PS5 and now a locked 60fps on PS5 Pro. But if you go for fidelity mode on Pro, it throws in more barely noticeable RT improvements which, without a VRR display, create terrible fps dips while swinging around.

That has also become a problem, relying on VRR. It's great for higher frame rates over 60 fps to maintain perceived stability, but it's a bad crutch for games that can't reach 60 fps. Big swings under 60 fps remain very distracting and don't help image reconstruction techniques.


Horizon looks amazing, yet when running around water you notice it's still limited by screen-space reflections; at the edges the reflections are missing or look weird. You only see it while running along water, so it's not all that distracting. Certainly not like huge frame drops or upscaling artifacts.

I'm sure RT can put more 'realism' in scenes like this

Flood it with shadows and ray traced lighting. Yet for now it's not worth the performance cost. RT is great when you can switch to it and not have to do all the lighting work, shadow and reflection maps. But now it's just an extra, no time saver, more work. And thus not the priority for optimization.

RT might make this look better too, or worse

Running at a locked 60 FPS makes it look more impressive than adding RT reflections at 30 fps, imo.

It looks great without RT, not missing it.


Just like VR, RT is still in its experimental stage. Too early to switch to, can add depth but mostly adds performance woes.



SvennoJ said:
LegitHyperbole said:

I'm a fan of the choices. I particularly like Stellar Blade's solution, where they offer a balanced mode between the two, and it works for me perfectly. I agree though, RT needs to be ignored until we have hardware that can actually handle it or do something significant with it. It's pitiful on base PS5, and while better on Pro, it's still too much of a cost. Lazy devs think they can get away with this instead of working on the game's graphics. 

It works in Spiderman 2 though. The default RT options, that is. It managed a locked 30fps on base PS5 and now a locked 60fps on PS5 Pro. But if you go for fidelity mode on Pro, it throws in more barely noticeable RT improvements which, without a VRR display, create terrible fps dips while swinging around.

That has also become a problem, relying on VRR. It's great for higher frame rates over 60 fps to maintain perceived stability, but it's a bad crutch for games that can't reach 60 fps. Big swings under 60 fps remain very distracting and don't help image reconstruction techniques.


Horizon looks amazing, yet when running around water you notice it's still limited by screen-space reflections; at the edges the reflections are missing or look weird. You only see it while running along water, so it's not all that distracting. Certainly not like huge frame drops or upscaling artifacts.

I'm sure RT can put more 'realism' in scenes like this

Flood it with shadows and ray traced lighting. Yet for now it's not worth the performance cost. RT is great when you can switch to it and not have to do all the lighting work, shadow and reflection maps. But now it's just an extra, no time saver, more work. And thus not the priority for optimization.

RT might make this look better too, or worse

Running at a locked 60 FPS makes it look more impressive than adding RT reflections at 30 fps, imo.

It looks great without RT, not missing it.


Just like VR, RT is still in its experimental stage. Too early to switch to, can add depth but mostly adds performance woes.

As far as screen-space reflections go, I'd like to see more cheap solutions which don't rely on RT, whether it be blending them with cubemaps, or some sort of technique that uses info from previous frames to fill in what is being occluded in the current one. The latter could maybe be executed with AI at a fraction of the cost of real RT.
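The cubemap-blend idea above can be sketched as a simple per-pixel fallback: where the screen-space ray misses (left the screen or hit occluded geometry), use a prefiltered cubemap sample, and fade SSR out near the screen edges where its data gets unreliable. A minimal sketch in Python; all names (`edge_fade`, `reflect_color`) and the linear fade are hypothetical stand-ins for what a real shader would do:

```python
def edge_fade(u: float, v: float, margin: float = 0.1) -> float:
    """Confidence in the SSR hit: 1.0 in the screen interior,
    fading linearly to 0.0 within `margin` of any screen edge."""
    fx = min(u, 1.0 - u) / margin
    fy = min(v, 1.0 - v) / margin
    return max(0.0, min(1.0, min(fx, fy)))

def reflect_color(ssr_hit, ssr_rgb, hit_uv, cube_rgb):
    """Blend a screen-space reflection with a cubemap fallback.

    ssr_hit : whether the screen-space ray found on-screen geometry
    ssr_rgb : color at the SSR hit point (ignored if no hit)
    hit_uv  : (u, v) of the hit in [0,1]^2 screen space
    cube_rgb: prefiltered cubemap sample along the reflected ray
    """
    if not ssr_hit:
        return cube_rgb  # ray left the screen: cubemap only
    w = edge_fade(*hit_uv)  # trust SSR less near the edges
    return tuple(w * s + (1.0 - w) * c for s, c in zip(ssr_rgb, cube_rgb))
```

The temporal variant mentioned above would replace `cube_rgb` with a reprojected sample from the previous frame, with the same kind of confidence weight deciding how much to trust it.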




SvennoJ said:
LegitHyperbole said:

I'm a fan of the choices. I particularly like Stellar Blade's solution, where they offer a balanced mode between the two, and it works for me perfectly. I agree though, RT needs to be ignored until we have hardware that can actually handle it or do something significant with it. It's pitiful on base PS5, and while better on Pro, it's still too much of a cost. Lazy devs think they can get away with this instead of working on the game's graphics. 

It works in Spiderman 2 though. The default RT options, that is. It managed a locked 30fps on base PS5 and now a locked 60fps on PS5 Pro. But if you go for fidelity mode on Pro, it throws in more barely noticeable RT improvements which, without a VRR display, create terrible fps dips while swinging around.

That has also become a problem, relying on VRR. It's great for higher frame rates over 60 fps to maintain perceived stability, but it's a bad crutch for games that can't reach 60 fps. Big swings under 60 fps remain very distracting and don't help image reconstruction techniques.


Horizon looks amazing, yet when running around water you notice it's still limited by screen-space reflections; at the edges the reflections are missing or look weird. You only see it while running along water, so it's not all that distracting. Certainly not like huge frame drops or upscaling artifacts.

I'm sure RT can put more 'realism' in scenes like this

Flood it with shadows and ray traced lighting. Yet for now it's not worth the performance cost. RT is great when you can switch to it and not have to do all the lighting work, shadow and reflection maps. But now it's just an extra, no time saver, more work. And thus not the priority for optimization.

RT might make this look better too, or worse

Running at a locked 60 FPS makes it look more impressive than adding RT reflections at 30 fps, imo.

It looks great without RT, not missing it.


Just like VR, RT is still in its experimental stage. Too early to switch to, can add depth but mostly adds performance woes.

Too much cost for too little benefit. In Spiderman 2 you're playing the game at such speed you aren't going to notice reflections, and overall it just looks like the contrast has been adjusted. In Cyberpunk and The Witcher 3 you'd swear you just adjusted gamma or contrast, it's that pitiful. I do see some really cool stuff on PC, but even 4090s struggle to bring about the really cool stuff. We are struggling for 60fps; even on the Pro, 60fps is still an issue with some patches. Ray tracing should be completely ignored until the hardware can hit 4K 60, and then try supplementing with RT, but it shouldn't be a tool that excuses ignoring great graphics. It should be a complete afterthought.