
Forums - Sony Discussion - PS5 GDC Reveal and PS5 specs/performance Digital Foundry video analysis: 3.5 GHz 8-core Zen 2 CPU, 10.3 TF RDNA 2 RT-capable GPU, 16GB GDDR6 RAM, and a super crazy fast 5.5 GB/second SSD

 

Poll: How do you feel?

My brain become bigger su... — 21 (30.00%)
I am wet — 6 (8.57%)
What did he talk about??? — 5 (7.14%)
I want some more info — 9 (12.86%)

Total: 41
the-pi-guy said:
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

https://youtu.be/KfM_nTTxftE

A good example of frequency vs core count



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

the-pi-guy said:
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

https://youtu.be/KfM_nTTxftE

"And at the nuts and bolts level, there are still some lingering question marks. Both Sony and AMD have confirmed that PlayStation 5 uses a custom RDNA 2-based graphics core, but the recent DirectX 12 Ultimate reveal saw AMD confirm features that Sony has not, including variable rate shading."

This part seems to give credence to the post @Pemalite made, that it is quite possible the PS5 was designed on RDNA 1 with some features of RDNA 2.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

CrazyGPU said:
the-pi-guy said:
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

https://youtu.be/KfM_nTTxftE

That video shows that frequency has diminishing returns after a point and that the additional CUs of the Xbox Series X will make a difference. I think Sony's strategy is pricing.

It's a balancing act.

At some point, driving frequency higher will not net you a linear increase in performance, as bottlenecks in the design come into play (e.g. cache sizes).

And the same goes for just adding more CUs: you eventually reach a point where you won't see a linear increase in performance as bottlenecks come into play (e.g. the number of ROPs).

Digital Foundry was just demonstrating one aspect of this issue... and I wouldn't take it as gospel for how the Xbox Series X or PlayStation 5 scales with clock rate or CU count, because neither console uses the exact same RDNA hardware as the Radeon demonstrated in that video, so the scaling will be completely different.

Sony and Microsoft have spent years working with AMD on fine-tuning their designs; they made the best decisions to meet their design goals and get the most performance possible. Both consoles are great pieces of kit.
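To put rough numbers on that clocks-vs-CUs trade-off, here is a minimal sketch (Python; the CU counts and clocks below are the publicly quoted figures, assumed here purely for illustration) of the standard theoretical-peak formula: TFLOPS = CUs × 64 shaders × 2 FLOPs per clock × clock in GHz ÷ 1000. It only captures peak ALU throughput, not the cache, ROP, or bandwidth bottlenecks mentioned above.

# Theoretical peak FP32 throughput: a rough sketch only, ignoring
# real-world bottlenecks (caches, ROPs, memory bandwidth).
def peak_tflops(cu_count: int, clock_ghz: float) -> float:
    shaders_per_cu = 64      # RDNA: 64 stream processors per CU
    ops_per_clock = 2        # one fused multiply-add counts as 2 FLOPs
    return cu_count * shaders_per_cu * ops_per_clock * clock_ghz / 1000.0

# Publicly quoted figures, assumed here for illustration:
print(f"PS5: 36 CUs @ 2.23 GHz  -> {peak_tflops(36, 2.23):.2f} TF")   # ~10.28 TF
print(f"XSX: 52 CUs @ 1.825 GHz -> {peak_tflops(52, 1.825):.2f} TF")  # ~12.15 TF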

DonFerrari said:
the-pi-guy said:
https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

https://youtu.be/KfM_nTTxftE

"And at the nuts and bolts level, there are still some lingering question marks. Both Sony and AMD have confirmed that PlayStation 5 uses a custom RDNA 2-based graphics core, but the recent DirectX 12 Ultimate reveal saw AMD confirm features that Sony has not, including variable rate shading."

This part seems to give credence to the post @Pemalite made, that it is quite possible the PS5 was designed on RDNA 1 with some features of RDNA 2.

Pretty much just waiting on more info before I do a full breakdown on that... And what it could potentially mean for gaming... I do tend to get "information" before most people because of the Enthusiast tech circles I engage with.

Last edited by Pemalite - on 02 April 2020

--::{PC Gaming Master Race}::--

Pemalite said:
CrazyGPU said:

That video shows that frequency has diminishing returns after a point and that the additional CUs of the Xbox Series X will make a difference. I think Sony's strategy is pricing.

It's a balancing act.

At some point, driving frequency higher will not net you a linear increase in performance, as bottlenecks in the design come into play (e.g. cache sizes).

And the same goes for just adding more CUs: you eventually reach a point where you won't see a linear increase in performance as bottlenecks come into play (e.g. the number of ROPs).

Digital Foundry was just demonstrating one aspect of this issue... and I wouldn't take it as gospel for how the Xbox Series X or PlayStation 5 scales with clock rate or CU count, because neither console uses the exact same RDNA hardware as the Radeon demonstrated in that video, so the scaling will be completely different.

Sony and Microsoft have spent years working with AMD on fine-tuning their designs; they made the best decisions to meet their design goals and get the most performance possible. Both consoles are great pieces of kit.

DonFerrari said:

"And at the nuts and bolts level, there are still some lingering question marks. Both Sony and AMD have confirmed that PlayStation 5 uses a custom RDNA 2-based graphics core, but the recent DirectX 12 Ultimate reveal saw AMD confirm features that Sony has not, including variable rate shading."

This part seems to give credence to the post @Pemalite made, that it is quite possible the PS5 was designed on RDNA 1 with some features of RDNA 2.

Pretty much just waiting on more info before I do a full breakdown on that... And what it could potentially mean for gaming... I do tend to get "information" before most people because of the Enthusiast tech circles I engage with.

Will look for your breakdown, and also compare it with whatever the "professionals" do, because apart from key folks at DF and Ars Technica I wouldn't trust other "pros", so you and CGI would be a good comparison, from the software and hardware sides, of what to expect.

From the "pros" we got everything from a negligible difference, to the PS5 being the most revolutionary console, to the XSX advantage being massive. In reality it is mostly that the XSX is more powerful, but the roughly 15% power difference won't be noticeable for most gamers, since the output on the TV will be 4K30fps or the like.




Cerny should have stayed quiet. Making statements without clear facts was a real mistake, and the DF video didn't help him or Sony for that matter.

Mark Cerny: "....PS5 can sustain GPU and CPU at maximum frequency most of the time...."

This doesn't answer anything, and if anything it just made things worse by evading a clear answer, one that we all know or suspect. The earlier assumption that the PS5 will not always sustain a 2.23 GHz frequency is therefore correct.

What bothers me is the "most of the time" comment. Please define "sustain" and "most", and then explain what the counterbalancing factors or estimated performance drops are when the GPU/CPU frequency goes down. Also, what is the base clock? For Christ's sake, just say it: it is 9.2 TFLOPS, period. I have no problem accepting the PS5 as a 9.2 TF system with a peak of 10.2 TF if it ends up cheaper and handles 4K30/60 very well.

In addition, when the DF guy talks (at 07:06) about the Spider-Man game engine rendering the sky, he clearly says that in a similar situation the PS5 would run at a higher frequency because the GPU is not fully occupied, but that is not really the case here, because by design the PS5's boost works by assuming the GPU is occupied for the entire frame (to avoid race to idle). What I understand from this is that the PS5 will offer 2.23 GHz as long as it can stay within the given power budget (a fixed CPU+GPU power consumption, sized for the cooling solution); once the workload demands more than the total power budget allows, the GPU will have to scale down. So in the end, frequency will increase to meet demand and scale down when the GPU is not in use or when the workload exceeds the capacity available.
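As a rough illustration of that reading of the boost scheme, here is a toy model (Python; every wattage and budget figure in it is a made-up assumption, not a Sony number) of a fixed power budget shared between CPU and GPU, with the GPU clock backing off only when a frame's demand would exceed the budget.

# Toy model of a fixed SoC power budget shared by CPU and GPU.
# All wattages and the budget itself are invented for illustration only.
def gpu_clock_ghz(cpu_power_w, gpu_activity, budget_w=200.0,
                  max_clock=2.23, min_clock=2.0):
    """Return the GPU clock for this frame under a fixed power budget.

    cpu_power_w  -- watts the CPU is drawing this frame (assumed)
    gpu_activity -- 0.0..1.0, fraction of the GPU kept busy for the frame
    """
    gpu_demand_w = 150.0 * gpu_activity              # assumed GPU draw at max clock
    if gpu_demand_w == 0 or cpu_power_w + gpu_demand_w <= budget_w:
        return max_clock                             # budget holds: stay at 2.23 GHz
    # Budget exceeded: scale the clock down in proportion to the shortfall.
    headroom_w = max(budget_w - cpu_power_w, 0.0)
    scale = headroom_w / gpu_demand_w                # fraction of demand we can afford
    return max(min_clock, min_clock + (max_clock - min_clock) * scale)

# A light frame leaves headroom, so the clock stays at the cap...
print(gpu_clock_ghz(cpu_power_w=40, gpu_activity=0.6))   # 2.23
# ...while a worst-case frame (CPU and GPU both maxed) sheds a few percent.
print(gpu_clock_ghz(cpu_power_w=60, gpu_activity=1.0))   # ~2.21

The real hardware reportedly budgets against modelled workload activity rather than measured temperature, but the sketch captures the behaviour being claimed: full clocks whenever the budget holds, and a modest drop when it doesn't.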

This point was proven again when they talked about the Doom Eternal engine (at 08:51): "...but I digress, the point is the clock is variable, it will be going lower in some scenarios...". That is the proof, and my main concern.

What tools or built-in measures are developers given to avoid bottlenecking the system when the CPU or GPU throttles down?
This will be a real problem for them at launch. They will have to sacrifice either rendering quality or maximum resolution to achieve a constant level of detail across the entire gaming experience. I would like to hear them talk about new APIs and new down-to-the-metal coding techniques that could recover performance in a given moment or scenario, as these could be a game changer for them. Take Nvidia's DLSS 2.0 feature, for example: it can increase performance by up to 33% without visually degrading the picture.
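To make the kind of headroom such reconstruction techniques buy more concrete (the 33% figure above is the poster's claim; the arithmetic below is just pixel counting, under the simplifying assumption that shading cost scales roughly with pixels rendered):

# Pixel-count arithmetic for internal-resolution rendering + upscaling.
native_4k      = 3840 * 2160    # pixels shaded at native 4K
internal_1440p = 2560 * 1440    # pixels shaded before upscaling to 4K

ratio = native_4k / internal_1440p
print(f"Native 4K shades {ratio:.2f}x the pixels of a 1440p internal render")
# ~2.25x, which is why reconstruction techniques free up so much GPU time.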

I suppose the end result will not be that much different from the Xbox Series X in multi-platform titles, if developers are given the proper tools and enough experience with the PS5 dev kits.
I don't doubt that most of the exclusive PlayStation developers will extract every bit of performance and put the PS5 on par with the Xbox Series X.

I love Sony and I don't care if it is the weaker system, but this approach of obscuring facts and details makes me hesitant about the PS5.

And where the heck is PS1/PS2/PS3 BC? I don't see Cerny ruling it out, but I would like a clear position on this. Yes or no, it's that simple.



alexxonne said:
Cerny should have stayed quiet. Making statements without clear facts was a real mistake, and the DF video didn't help him or Sony for that matter.

Mark Cerny: "....PS5 can sustain GPU and CPU at maximum frequency most of the time...."

This doesn't answer anything, and if anything it just made things worse by evading a clear answer, one that we all know or suspect. The earlier assumption that the PS5 will not always sustain a 2.23 GHz frequency is therefore correct.

What bothers me is the "most of the time" comment. Please define "sustain" and "most", and then explain what the counterbalancing factors or estimated performance drops are when the GPU/CPU frequency goes down. Also, what is the base clock? For Christ's sake, just say it: it is 9.2 TFLOPS, period. I have no problem accepting the PS5 as a 9.2 TF system with a peak of 10.2 TF if it ends up cheaper and handles 4K30/60 very well.

In addition, when the DF guy talks (at 07:06) about the Spider-Man game engine rendering the sky, he clearly says that in a similar situation the PS5 would run at a higher frequency because the GPU is not fully occupied, but that is not really the case here, because by design the PS5's boost works by assuming the GPU is occupied for the entire frame (to avoid race to idle). What I understand from this is that the PS5 will offer 2.23 GHz as long as it can stay within the given power budget (a fixed CPU+GPU power consumption, sized for the cooling solution); once the workload demands more than the total power budget allows, the GPU will have to scale down. So in the end, frequency will increase to meet demand and scale down when the GPU is not in use or when the workload exceeds the capacity available.

This point was proven again when they talked about the Doom Eternal engine (at 08:51): "...but I digress, the point is the clock is variable, it will be going lower in some scenarios...". That is the proof, and my main concern.

What tools or built-in measures are developers given to avoid bottlenecking the system when the CPU or GPU throttles down?
This will be a real problem for them at launch. They will have to sacrifice either rendering quality or maximum resolution to achieve a constant level of detail across the entire gaming experience. I would like to hear them talk about new APIs and new down-to-the-metal coding techniques that could recover performance in a given moment or scenario, as these could be a game changer for them. Take Nvidia's DLSS 2.0 feature, for example: it can increase performance by up to 33% without visually degrading the picture.

I suppose the end result will not be that much different from the Xbox Series X in multi-platform titles, if developers are given the proper tools and enough experience with the PS5 dev kits.
I don't doubt that most of the exclusive PlayStation developers will extract every bit of performance and put the PS5 on par with the Xbox Series X.

I love Sony and I don't care if it is the weaker system, but this approach of obscuring facts and details makes me hesitant about the PS5.

And where the heck is PS1/PS2/PS3 BC? I don't see Cerny ruling it out, but I would like a clear position on this. Yes or no, it's that simple.

Seems like you ignored what was asked and answered, and just decided to vent.

He is quite clear that there are situations where the GPU would be idle, so there is no point in raising the frequency. He also points out that the devkit gives feedback on GPU/CPU and frame utilization, so developers can optimize and do the same work with fewer resources (or keep the resources and fill them with more work).

Last edited by DonFerrari - on 02 April 2020


He wasn't clear. You are being clear when you talk with hard facts, not when you're sugar-coating them with technobabble.

Mark Cerny (35:28, The Road to PS5): "We went with a variable frequency strategy for PS5 which is to say we continuously run the gpu in boost mode"

If we take this literally, then it is not true. The PS5 will not run continuously in boost mode; it will try to. It will go up or down.

Why choose words like continuously, boost, and variable if the frequency will not be sustained? It confuses people; just look at all the articles written because of it. His words were carefully chosen so the PS5 doesn't end up trivialized as the weaker system, and I understand that. But I would prefer a more honest approach. Something like "we chose to design the PS5 with a target price of 399 in mind and great 4K30/60fps experiences, while maintaining the accessibility the PlayStation family has always provided to traditional console gamers and newcomers" would have been much easier to digest and understand.

The PS5 is a 9.2 TF base system. Whether you want to believe it or not is your choice. But the early leaks on frequency (2 GHz) and the GPU-codename benchmarks pointed to a 9.2 TF system, the same system that, given a 2.23 GHz clock, results in 10.2 TF. Hell, Digital Foundry talked about this in three separate videos.

DonFerrari said:

Seems like you ignored what was asked and answered, and just decided to vent.

He is quite clear that there are situations where the GPU would be idle, so there is no point in raising the frequency. He also points out that the devkit gives feedback on GPU/CPU and frame utilization, so developers can optimize and do the same work with fewer resources (or keep the resources and fill them with more work).

I did not. That is why he talked about game engines traditionally taking GPU race-to-idle into account; with the PS5 it is not the same, because the system assumes the boost for the entire frame (30 or 60 fps), even if the workload doesn't reach maximum power. I gave the references; watch them.

Also, giving feedback is not enough. You need new tools to better handle this different approach. You can't expect a developer to target the Xbox Series X and the PS5 the same way if the two reach their peak power very differently. That is why it is so important for the PS5 SDK to support new tools, features, and APIs to better balance the system. And yet they haven't talked about this, and it will be the secret sauce, if there is any.

Last edited by alexxonne - on 02 April 2020

alexxonne said:

He wasn't clear. You are being clear when you talk with hard facts, not when you're sugar-coating them with technobabble.

Mark Cerny (35:28, The Road to PS5): "We went with a variable frequency strategy for PS5 which is to say we continuously run the gpu in boost mode"

If we take this literally, then it is not true. The PS5 will not run continuously in boost mode; it will try to. It will go up or down.

Why choose words like continuously, boost, and variable if the frequency will not be sustained? It confuses people; just look at all the articles written because of it. His words were carefully chosen so the PS5 doesn't end up trivialized as the weaker system, and I understand that. But I would prefer a more honest approach. Something like "we chose to design the PS5 with a target price of 399 in mind and great 4K30/60fps experiences, while maintaining the accessibility the PlayStation family has always provided to traditional console gamers and newcomers" would have been much easier to digest and understand.

The PS5 is a 9.2 TF base system. Whether you want to believe it or not is your choice. But the early leaks on frequency (2 GHz) and the GPU-codename benchmarks pointed to a 9.2 TF system, the same system that, given a 2.23 GHz clock, results in 10.2 TF. Hell, Digital Foundry talked about this in three separate videos.

DonFerrari said:

Seems like you ignored what was asked and answered, and just decided to vent.

He is quite clear that there are situations where the GPU would be idle, so there is no point in raising the frequency. He also points out that the devkit gives feedback on GPU/CPU and frame utilization, so developers can optimize and do the same work with fewer resources (or keep the resources and fill them with more work).

I did not. That is why he talked about game engines traditionally taking GPU race-to-idle into account; with the PS5 it is not the same, because the system assumes the boost for the entire frame (30 or 60 fps), even if the workload doesn't reach maximum power. I gave the references; watch them.

Also, giving feedback is not enough. You need new tools to better handle this different approach. You can't expect a developer to target the Xbox Series X and the PS5 the same way if the two reach their peak power very differently. That is why it is so important for the PS5 SDK to support new tools, features, and APIs to better balance the system. And yet they haven't talked about this, and it will be the secret sauce, if there is any.

It will be continuous. It isn't 9.2 or 10.28 TFLOPS, or 2 GHz versus 2.23 GHz; the frequency will vary continuously as needed for both the CPU and the GPU. And most of the time it will be at the maximum, his words. If you don't want to believe that, then there would have been no point in him giving any explanation, as none would satisfy you.

The price isn't defined, so he wouldn't talk about that in any way.

As for the boost, base clocks, or leaks, that is just nonsense; this wasn't developed in a week or a month to beat Xbox. It is the philosophy they have gone with for the console for a long time.

I have never seen any GDC talk, or any other, that gives details on the development tools. He said in the GDC video that the time for a team that developed for PS4 to get going on PS5 is the shortest ever, about a month. He also said that most of the features can be ignored by developers or will work invisibly to them, and that the tools allow optimizing the way a package is built to fully utilize the system, with developers able to monitor it continuously. That seems like enough detail to show the devkit was improved to help with this. He also talks about devs not wanting uncertainty, be it from one PS5 to another working differently due to changes in frequency and thermals, so you can rest easy on that.

The PS4 Pro was a lot weaker than the X1X and Sony didn't have any need to hide it; the PS4 was a lot stronger than the X1 and Sony also didn't advertise that much. So I'm not sure why you think they are trying to hide the console's weakness.

You just want more information than you got, and are frustrated because they are not giving you what you want when you want it.




the-pi-guy said:

1.) We don't know what the base clock is. We don't even know if there is a base clock at all, because the paradigm for how the system chooses a frequency is completely opposite to the norm.

2.) From Cerny's comments, we know it spends the majority of its time near the max frequencies. More than likely it'll be closer to a 10 TF machine in games.

Actual PS5 (10.28 TF / 2.23 GHz GPU) = a performance factor of 4.609865 TF per GHz.

Leaked PS5 "Oberon" specs (9.2 TF / 2 GHz GPU): using the actual PS5's performance factor (2 GHz × 4.609865) gives 9.2 TF, the same as the leaks. They are the same chip. You are in denial if you don't see it, but you have the right to choose so.
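For what it's worth, that "performance factor" is simply the GPU's FLOPs per cycle expressed in TF per GHz. A quick check of the arithmetic (Python; the 36-CU "Oberon" configuration is taken from the leaks being discussed, not confirmed anywhere in this thread):

# 36 CUs x 64 shaders x 2 FLOPs per clock = 4608 FLOPs per cycle
flops_per_cycle = 36 * 64 * 2
tf_per_ghz = flops_per_cycle / 1000.0        # ~4.61 TF per GHz, the "factor" above

print(f"At 2.23 GHz: {tf_per_ghz * 2.23:.2f} TF")   # ~10.28 TF (announced peak)
print(f"At 2.00 GHz: {tf_per_ghz * 2.00:.2f} TF")   # ~9.22 TF (the leaked figure)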

That is why, in my opinion, Cerny's words were technobabble: in the end, none of the words spoken matter. It's about performance, just that.

I remember when Hyper-Threading came to Pentium 4 CPUs; benchmarks were through the roof. I bought one on day one at great cost. In the end a cheap Athlon CPU handled games better, because game engines weren't optimized for HT. It took years for developers to take proper advantage of it and for operating systems to properly manage threads, especially once multi-core CPUs started to appear.

If Cerny wants to justify the lack of performance by pointing to a variable frequency and you are OK with that, great. But I'm not; to me it's a cheap excuse.

If they intended to deliver information to developers, then they're very late. Developers are already building games.

The talk was aimed at everyone in general. Hell, it came via the PlayStation Blog.

Last edited by alexxonne - on 02 April 2020

the-pi-guy said:

If being in denial means sticking to the facts instead of hypothesizing based on incomplete data, then yeah, I'll be in denial all day.

You didn't refute either comment I made.  

I don't need to refute something you've already accepted you're in denial about, and I don't intend to argue.

You have your opinion and I respect that. I have mine; I expect the same.

But as far as hard facts go, even Digital Foundry's article says it, and they are the ones with the most expertise.

The worst part, and this is pure speculation on my part, is that the base clock/performance could be lower than 2 GHz / 9.2 TF, but thankfully this will not matter too much given the design's focus on boost and peak performance, and that gives Sony something.

Last edited by alexxonne - on 02 April 2020