
Cerny should have stayed quiet. Making statements without clear facts was a real mistake. The DF video didn't help him or Sony, for that matter.

Mark Cerny: "...PS5 can sustain GPU and CPU at maximum frequency most of the time..."

This doesn't answer anything, and if anything, it just made things worse by evading a clear answer, one we all know or suspect.
The earlier assumption that the PS5 will not always sustain a 2.23 GHz frequency is therefore correct.

What bothers me is the "most of the time" comment. Man, please define "sustain" and "most", and then explain what the counterbalancing factors or estimated performance drops are when the GPU/CPU frequency goes down. Also, what is the base clock? For Christ's sake, just say it: it's 9.2 TFLOPS, period. I have no problem accepting the PS5 as a 9.2 TF system with a peak of 10.2 TF if it ends up cheaper and handles 4K 30/60 very well.
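For reference, here's the back-of-the-envelope math behind those two numbers, using the 36-CU count from Cerny's talk (the ~2.0 GHz sustained clock is my own guess, since Sony hasn't stated a base clock):

```python
# TFLOPS = CUs * 64 shaders per CU * 2 FLOPs per cycle * clock
CUS = 36  # PS5 compute units, per Cerny's talk

def teraflops(clock_ghz: float) -> float:
    return CUS * 64 * 2 * clock_ghz / 1000.0

print(teraflops(2.23))  # ~10.28 TF at the advertised 2.23 GHz peak
print(teraflops(2.00))  # ~9.22 TF at a hypothetical ~2.0 GHz sustained clock
```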

In addition, when the DF guy talked (at 07:06) about the Spider-Man engine rendering the sky, he clearly says that in a similar situation the PS5 would run at a higher frequency because the GPU is not fully occupied. But that isn't the case here, because by design the PS5's boost works by assuming the GPU is occupied for the entire frame (to avoid race-to-idle). What I understand from this is that the PS5 will offer 2.23 GHz as long as it can stay within the given power budget (a fixed CPU+GPU power draw chosen for an optimal cooling solution); once the workload demands more power than the total budget allows, the GPU will have to scale down. So in the end, the frequency increases to meet demand and scales down when idle or when the workload exceeds the available capacity.
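To put what I mean in code, here's a toy model of that shared, fixed power budget (entirely my own sketch, not Sony's actual algorithm, and all the wattage numbers are invented since Sony hasn't published any):

```python
# Toy model of a fixed CPU+GPU power budget (all numbers invented).
TOTAL_BUDGET_W = 200.0    # hypothetical combined CPU+GPU budget
GPU_PEAK_GHZ = 2.23       # advertised peak GPU clock
GPU_PEAK_POWER_W = 180.0  # hypothetical GPU draw at peak clock, full load

def gpu_clock_ghz(cpu_power_w: float, gpu_activity: float) -> float:
    """Return the GPU clock for this frame.

    gpu_activity in [0, 1]: fraction of the GPU doing heavy work.
    Crude linear power model: draw scales with clock * activity.
    """
    gpu_budget_w = TOTAL_BUDGET_W - cpu_power_w
    demand_w = GPU_PEAK_POWER_W * gpu_activity  # draw at peak clock
    if demand_w <= gpu_budget_w:
        return GPU_PEAK_GHZ                     # budget holds: full clock
    # Workload too hot: scale the clock down to fit the budget.
    return GPU_PEAK_GHZ * gpu_budget_w / demand_w

# Light scene: the clock stays pinned at 2.23 GHz.
print(gpu_clock_ghz(cpu_power_w=40.0, gpu_activity=0.7))   # 2.23
# Worst-case scene with a busy CPU: the clock drops below peak.
print(gpu_clock_ghz(cpu_power_w=60.0, gpu_activity=0.95))  # ~1.83
```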

This point was proven again when they talked about the Doom Eternal engine (at 08:51): "...but I digress, the point is the clock is variable, it will be going lower in some scenarios...". That is the absolute proof, and my main concern.

What tools or built-in safeguards are developers given to avoid bottlenecking the system by throttling down the CPU or GPU?
This will be a real problem for them at launch. They will have to sacrifice either rendering quality or maximum resolution to achieve a consistent level of detail across the entire gaming experience. I would like to hear them talk about new APIs and new down-to-the-metal coding techniques that could deliver extra performance in a given moment or scenario, as these could be a game-changer for them. Take Nvidia's DLSS 2.0 feature, for example: it can increase performance by up to 33% without visibly degrading the picture.
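The most obvious mitigation I can think of is dynamic resolution scaling, which many current-gen engines already use. Here's a minimal sketch of the idea (my own illustration, not any Sony API): shrink the render target when a frame misses its budget and grow it back when there's headroom, so the frame rate stays level while the clocks vary.

```python
# Minimal dynamic-resolution-scaling loop (illustrative only).
TARGET_MS = 16.6  # frame budget for 60 fps

def next_scale(scale: float, last_frame_ms: float) -> float:
    """scale = fraction of native resolution per axis (1.0 = full 4K)."""
    if last_frame_ms > TARGET_MS:         # missed budget: render fewer pixels
        return max(0.70, scale * 0.95)
    if last_frame_ms < TARGET_MS * 0.90:  # clear headroom: creep back up
        return min(1.00, scale * 1.02)
    return scale                          # close enough: hold steady
```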

I suppose the end result will not be that much different from the Xbox Series X in multi-platform titles, provided developers are given the proper tools and enough experience with PS5 dev kits.
I don't doubt that most of the exclusive PlayStation developers will extract every bit of performance and put the PS5 on par with the Xbox Series X.

I love Sony, and I don't care if it is the weaker system, but this approach of obscuring facts and details makes me hesitant about the PS5.

And where the heck is PS1/PS2/PS3 backwards compatibility? I didn't see Cerny rule it out, either. I would like a clear position on this: yes or no, it's that simple.