
Digital Foundry - PS5 Pro's PSSR vs PC DLSS/FSR 3.1

Chrkeller said:

And I tend to agree with Pemalite, 16GB is on the low side for total memory. It's one of the reasons I think people are expecting too much from the Switch 2; 12GB isn't what it used to be. Personally I wouldn't build a rig without 32GB of system RAM and at least 16GB of dedicated VRAM. Even 16GB of VRAM I think will be low in a couple of years.

Keep in mind they don't even get 16GB.
It's 16GB total for System Memory and Video Memory... And it has to share it with the OS/Background.

The PS5 Pro gives 13.7GB for developers to use... The OS/Background steals 2.3GB.

Even Ryzen-powered handhelds these days are coming with 32GB of DDR5 because it just lets the hardware breathe a little better.

Heck.. Go back 15+ years and my PC back then had 16GB System Memory and 4GB Graphics memory for 20GB total... And I would use it all as well.
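
Back-of-the-envelope, using the figures above (these are the numbers quoted in this thread, not official spec sheets):

```python
# Rough memory budgets, using the figures quoted above.
ps5_pro_total_gb = 16.0           # unified pool, shared CPU + GPU
ps5_pro_os_gb = 2.3               # reserved for the OS/background
ps5_pro_games_gb = ps5_pro_total_gb - ps5_pro_os_gb
print(f"PS5 Pro budget for games: {ps5_pro_games_gb:.1f} GB")  # 13.7 GB

old_pc_total_gb = 16.0 + 4.0      # 16GB system RAM + 4GB VRAM
print(f"15-year-old PC total:     {old_pc_total_gb:.1f} GB")   # 20.0 GB
```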

Chrkeller said:

Also, Square uses their own game engines and they are hot garbage, so don't blame the hardware. FF16 runs like crap on an $1,800 RTX 4090, as an example. At 4K + max settings the framerate drops into the 50s, which is absurd.

Making game engines is hard, especially now when there is a degree of visual expectation that requires a fairly large budget to develop and support... 343 Industries and even CD Projekt Red have now abandoned Slipspace and REDengine in favor of Unreal Engine.

Square *used* to make extremely competent game engines that allowed the target hardware to shine... But that hasn't happened since the PS2 era. I would argue their proficiency peaked on the PS1, with Final Fantasy 8 and 9 showing some extremely good results for the hardware.



--::{PC Gaming Master Race}::--

Chrkeller said:
LegitHyperbole said:

Checking out those new comparisons in the OP... this isn't making the Pro look good, this is making the base PS5 look like dog shit. What the actual fuck. You mean the enthusiast €800+ console is now what's necessary? Ffs. I'm set to buy a PS5 in less than a month, but now... what? You mean these games that don't look much better graphically than on my PS4 Pro look this blurry? What went wrong? Surely the PS5 hardware is more capable than that shit... I'm actually stumped. What happened to checkerboard rendering, which was producing so much better quality on the PS4 Pro? And for the love of God, if the developers can't optimise the game for the hardware, why are they chasing graphics? Bring the fidelity down to the point the console can handle, even if it doesn't look much better than 8th gen. Why do I own a 4K TV if that's the image I'm getting? This was half sorted with the PS4 Pro, why is it such a step back...

Settle down, my friend. The PS5 is a significant jump over the PS4. Those pics are zoomed in, and there is no doubt Rebirth smokes Remake in terms of fidelity and size/scope. The PS5 is a very powerful system, you won't be disappointed.

Edit

Also, Square uses their own game engines and they are hot garbage, so don't blame the hardware. FF16 runs like crap on an $1,800 RTX 4090, as an example. At 4K + max settings the framerate drops into the 50s, which is absurd.

Edit

And I tend to agree with Pemalite, 16GB is on the low side for total memory. It's one of the reasons I think people are expecting too much from the Switch 2; 12GB isn't what it used to be. Personally I wouldn't build a rig without 32GB of system RAM and at least 16GB of dedicated VRAM. Even 16GB of VRAM I think will be low in a couple of years.

Damn, I hope you're right. I suppose looking at it all like that in one area does warp perspective, but seeing that DD2 comparison is wild. I have to question why it's so terrible.

Seems like the whole "engines are scalable now" thing is falling apart.

FSR seems to be the problem. I wonder if that can get better this generation. Like, why doesn't Sony work with AMD on that POS?

Last edited by LegitHyperbole - on 21 October 2024

LegitHyperbole said:
Chrkeller said:

Settle down, my friend. The PS5 is a significant jump over the PS4. Those pics are zoomed in, and there is no doubt Rebirth smokes Remake in terms of fidelity and size/scope. The PS5 is a very powerful system, you won't be disappointed.

Edit

Also, Square uses their own game engines and they are hot garbage, so don't blame the hardware. FF16 runs like crap on an $1,800 RTX 4090, as an example. At 4K + max settings the framerate drops into the 50s, which is absurd.

Edit

And I tend to agree with Pemalite, 16GB is on the low side for total memory. It's one of the reasons I think people are expecting too much from the Switch 2; 12GB isn't what it used to be. Personally I wouldn't build a rig without 32GB of system RAM and at least 16GB of dedicated VRAM. Even 16GB of VRAM I think will be low in a couple of years.

Damn, I hope you're right. I suppose looking at it all like that in one area does warp perspective, but seeing that DD2 comparison is wild. I have to question why it's so terrible.

Seems like the whole "engines are scalable now" thing is falling apart.

FSR seems to be the problem. I wonder if that can get better this generation. Like, why doesn't Sony work with AMD on that POS?

FSR is pretty shit, no doubt. But Square has terrible engines. None of their games run as well as they should, certainly not like Insomniac's games. Hell, the recommended GPU for Remake on PC at 1440p is a GTX 1080, which is absurdly more powerful than a PS4. Square is just technically incompetent at this point.



Chrkeller said:
LegitHyperbole said:

Damn, I hope you're right. I suppose looking at it all like that in one area does warp perspective, but seeing that DD2 comparison is wild. I have to question why it's so terrible.

Seems like the whole "engines are scalable now" thing is falling apart.

FSR seems to be the problem. I wonder if that can get better this generation. Like, why doesn't Sony work with AMD on that POS?

FSR is pretty shit, no doubt. But Square has terrible engines. None of their games run as well as they should, certainly not like Insomniac's games. Hell, the recommended GPU for Remake on PC at 1440p is a GTX 1080, which is absurdly more powerful than a PS4. Square is just technically incompetent at this point.

Probably so. I suppose these are third-party games, and there hasn't been a true first-party next-gen title yet like there was with the PS4. Uncharted 4 released within 3 years of the PS4 and blew skulls open that such a cheap box could do so much. 9th gen is still in the cross-gen period for all intents and purposes.

Naughty Dog said their next game will be open world, or at least it's rumored, so I wouldn't even expect a showing like that this time around, unless Cory Barlog is on a new IP. But it's starting to look like he's on remakes of the original God of War games.

Perhaps there will be no true next-gen feel, and the 9th gen will just be this "can't get to 60fps, can't really get to great visuals" sorta nonsense, where they won't just meet in the middle and calibrate a game to be the best it can be in all aspects.



LegitHyperbole said:

Damn, I hope you're right. I suppose looking at it all like that in one area does warp perspective, but seeing that DD2 comparison is wild. I have to question why it's so terrible.

Seems like the whole "engines are scalable now" thing is falling apart.

FSR seems to be the problem. I wonder if that can get better this generation. Like, why doesn't Sony work with AMD on that POS?

FSR has been getting "improvements" over time - i.e. FSR 1.0 vs 2.0 vs 3.0 vs 3.1.
However, we need to understand what FSR is... And what FSR isn't.

FSR isn't using machine learning algorithms to enhance image quality... It's using a bunch of different post-process filters: smoothing edges on geometry, scaling the image up, then sharpening.
FSR 2.0 started grabbing "information" from previous frames to enhance current and future frames.

And FSR 3.0 takes a few extra approaches on top, like frame generation.

FSR's advantage is that it doesn't require tensor cores or specialized compute; it's cheap, it runs on everything... It could even run on the Xbox 360 if a developer wanted.
Current PS4 and Xbox One games are even leveraging it, which works well as Graphics Core Next is a very compute-centric GPU architecture.

However... The reason FSR exists is that AMD's GPUs are a few generations behind Nvidia, so AMD needed to "invent" an approach that would run on its existing technology until it could scale up hardware that can take a machine learning approach.

DLSS and PSSR use machine learning, which is an entirely different and superior approach. PSSR is still a generation behind DLSS, but there are massive gains over FSR.

FSR has its place, no doubt... And it's absolutely brilliant on handhelds/integrated graphics due to how cheap it is to implement; it doesn't require additional expensive silicon.
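
For intuition, here's a minimal toy sketch in Python/NumPy of what a spatial upscaler in the FSR 1.0 mould does conceptually: upscale the current frame, then sharpen it. This is not AMD's actual EASU/RCAS implementation (that's a hand-tuned, edge-adaptive GPU shader), just the general shape of the technique:

```python
import numpy as np
from scipy import ndimage

def toy_spatial_upscale(frame: np.ndarray, scale: float = 1.5,
                        sharpen_amount: float = 0.5) -> np.ndarray:
    """Toy FSR-1.0-style pass: upscale, then sharpen.

    No neural network, no frame history: every output pixel is
    produced by filtering the *current* frame only.
    """
    # 1. Spatial upscale. FSR 1.0 uses an edge-adaptive kernel (EASU);
    #    plain bilinear interpolation stands in for it here.
    upscaled = ndimage.zoom(frame, (scale, scale, 1), order=1)

    # 2. Sharpen via unsharp masking (standing in for RCAS):
    #    boost the difference between the image and a blurred copy.
    blurred = ndimage.gaussian_filter(upscaled, sigma=(1.0, 1.0, 0.0))
    sharpened = upscaled + sharpen_amount * (upscaled - blurred)
    return np.clip(sharpened, 0.0, 1.0)

# Upscale a 960x540 RGB frame by 1.5x toward a 1440x810 output.
frame = np.random.rand(540, 960, 3).astype(np.float32)
print(toy_spatial_upscale(frame).shape)  # (810, 1440, 3)
```

FSR 2.0 and later keep this hand-written-filter character but add temporal accumulation (reprojecting previous frames with motion vectors), whereas DLSS/PSSR replace the reconstruction filter with a trained network running on dedicated ML hardware.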



--::{PC Gaming Master Race}::--

LegitHyperbole said:
Chrkeller said:

Damn, I hope you're right. I suppose looking at it all like that in one area does warp perspective, but seeing that DD2 comparison is wild. I have to question why it's so terrible.

Seems like the whole "engines are scalable now" thing is falling apart.

FSR seems to be the problem. I wonder if that can get better this generation. Like, why doesn't Sony work with AMD on that POS?

Honestly, I think you're being swayed by aspects of that comparison video that have nothing to do with either console. The high contrast of the Pro footage looks like a result of video post-processing/how the footage was captured rather than anything inherent to the consoles. I would watch Digital Foundry's analysis of Dragon's Dogma 2 in isolation, and then judge the image without an immediate comparison. The image quality is not amazing, but it's also not as terrible as you're making out (Series S is another topic). In fact, the most shameful thing about its launch state is the lack of a proper capped 30fps mode.

The only PS5 games I've seen with truly bad image quality are Wukong, Immortals, and a bunch of games with performance modes that were never actually intended to have 60fps modes but were just chucked in to appease people (Jedi: Survivor, Alan Wake, FFXVI, FFVII Rebirth).

Across the board, I'd say developers are pushing features that this generation of consoles is not ready for above 1440p/30fps. RT global illumination is great, for example, but the best-looking PS5 games do not have it, and instead push more traditional rasterised lighting with dynamic blend states. The fact that The Last of Us Part II on PS4 still looks better than many PS5 titles I think highlights a lot of resources going towards expensive modern rendering tricks instead of ensuring the base artwork, image quality and density are up to scratch.

Last edited by Otter - on 22 October 2024

Pemalite said:
LegitHyperbole said:

Damn, I hope you're right. I suppose looking at it all like that in one area does warp perspective, but seeing that DD2 comparison is wild. I have to question why it's so terrible.

Seems like the whole "engines are scalable now" thing is falling apart.

FSR seems to be the problem. I wonder if that can get better this generation. Like, why doesn't Sony work with AMD on that POS?

FSR has been getting "improvements" over time - i.e. FSR 1.0 vs 2.0 vs 3.0 vs 3.1.
However, we need to understand what FSR is... And what FSR isn't.

FSR isn't using machine learning algorithms to enhance image quality... It's using a bunch of different post-process filters: smoothing edges on geometry, scaling the image up, then sharpening.
FSR 2.0 started grabbing "information" from previous frames to enhance current and future frames.

And FSR 3.0 takes a few extra approaches on top, like frame generation.

FSR's advantage is that it doesn't require tensor cores or specialized compute; it's cheap, it runs on everything... It could even run on the Xbox 360 if a developer wanted.
Current PS4 and Xbox One games are even leveraging it, which works well as Graphics Core Next is a very compute-centric GPU architecture.

However... The reason FSR exists is that AMD's GPUs are a few generations behind Nvidia, so AMD needed to "invent" an approach that would run on its existing technology until it could scale up hardware that can take a machine learning approach.

DLSS and PSSR use machine learning, which is an entirely different and superior approach. PSSR is still a generation behind DLSS, but there are massive gains over FSR.

FSR has its place, no doubt... And it's absolutely brilliant on handhelds/integrated graphics due to how cheap it is to implement; it doesn't require additional expensive silicon.

How come AMD can't use machine learning on their GPUs until they scale up their hardware, but Sony, using an AMD APU, can use machine learning and get way better results on AMD hardware than AMD does with FSR? Clearly the hardware is capable of it if Sony is pulling it off.



BraLoD said:
Pemalite said:

FSR has been getting "improvements" over time - i.e. FSR 1.0 vs 2.0 vs 3.0 vs 3.1.
However, we need to understand what FSR is... And what FSR isn't.

FSR isn't using machine learning algorithms to enhance image quality... It's using a bunch of different post-process filters: smoothing edges on geometry, scaling the image up, then sharpening.
FSR 2.0 started grabbing "information" from previous frames to enhance current and future frames.

And FSR 3.0 takes a few extra approaches on top, like frame generation.

FSR's advantage is that it doesn't require tensor cores or specialized compute; it's cheap, it runs on everything... It could even run on the Xbox 360 if a developer wanted.
Current PS4 and Xbox One games are even leveraging it, which works well as Graphics Core Next is a very compute-centric GPU architecture.

However... The reason FSR exists is that AMD's GPUs are a few generations behind Nvidia, so AMD needed to "invent" an approach that would run on its existing technology until it could scale up hardware that can take a machine learning approach.

DLSS and PSSR use machine learning, which is an entirely different and superior approach. PSSR is still a generation behind DLSS, but there are massive gains over FSR.

FSR has its place, no doubt... And it's absolutely brilliant on handhelds/integrated graphics due to how cheap it is to implement; it doesn't require additional expensive silicon.

How come AMD can't use machine learning on their GPUs until they scale up their hardware, but Sony, using an AMD APU, can use machine learning and get way better results on AMD hardware than AMD does with FSR? Clearly the hardware is capable of it if Sony is pulling it off.

I'm far from an expert, but I think AMD means well. I think FSR works on almost all hardware. It is more versatile, but 'open source' means lower quality.



BraLoD said:

How come AMD can't use machine learning on their GPUs until they scale up their hardware, but Sony, using an AMD APU, can use machine learning and get way better results on AMD hardware than AMD does with FSR? Clearly the hardware is capable of it if Sony is pulling it off.

I think the PS5 Pro has tech from the next iteration of AMD's GPUs, the hardware that PSSR is running on, which the base PS5 does not have.

AMD's next GPU series (the 8000 series) will have that hardware, so I'm guessing something similar to PSSR will be their counterpart to DLSS.



HoloDust said:
BraLoD said:

How come AMD can't use machine learning on their GPUs until they scale up their hardware, but Sony, using an AMD APU, can use machine learning and get way better results on AMD hardware than AMD does with FSR? Clearly the hardware is capable of it if Sony is pulling it off.

I think the PS5 Pro has tech from the next iteration of AMD's GPUs, the hardware that PSSR is running on, which the base PS5 does not have.

AMD's next GPU series (the 8000 series) will have that hardware, so I'm guessing something similar to PSSR will be their counterpart to DLSS.

Maybe it also has some separate processors they developed themselves to do the machine learning part, and it has nothing to do with AMD.