Can someone simplify the CPU/GPU thermal design for me?
My understanding is that because they have a set thermal limit, games will be able to nearly max out the CPU/GPU for extended periods without any concern.
Whereas in a more traditional design, the cooling has to be engineered around a guessed theoretical max load because of workload fluctuations?
I don't know.
Basically, you set your aircon to 23 degrees because you know that regardless of the sunlight or activity in the room, you'll stay cool. You can party hard during the day because the aircon has been set to handle all that play.
As opposed to setting it to 25 and having to drop the temperature later because the sunlight and in-room activity are warming the room up, making the aircon work harder?
Like I said, I'm trying to understand how it works on a more basic level.
Basically what Cerny tried to explain was...
a) Games are hardly ever both CPU- and GPU-bottlenecked. Almost always, one of them is maxed out while the other is partially idle. It's also really difficult to fully utilize an 8-core CPU, even when only 7 or 6 of the cores are available to developers; in practice, not all cores are maxed out 99% of the time.
b) When the CPU alone is maxed out, it is either because the work is not sufficiently parallel and only one or two cores are fully utilized, in which case the power budget is still underutilized and the extra power is directed to the GPU,
c) or because of a high level of parallelization, in which case the CPU frequency is lowered; but that is not much of an issue, since well-parallelized code achieves a much higher level of efficiency per clock.
d) When the GPU is maxed out, the CPU is usually under-utilized.
In short, in cases b and d, the power/thermal constraint is not an issue 99% of the time. In case c, it is an issue about 5% of the time. Overall, the CPU and GPU will hit their target frequencies 95-99% of the time. This translates to roughly a 5-10% deficit in CPU and a 15-20% deficit in GPU compared to the XBSX, which is, in the whole scheme of things, negligible for developers. However, the 2.3x difference in SSD speed is not, and that is gonna be a game changer for the PS5.
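To make the idea concrete, here's a minimal sketch of a fixed-power-budget scheduler. This is not Sony's actual algorithm, and all the numbers (budget, per-unit power shares, the cubic power/frequency relation) are illustrative assumptions: when combined CPU+GPU draw fits in the budget, both run at full clocks; when it doesn't, frequency is trimmed, and because power rises roughly with the cube of frequency, a small clock drop buys a large power saving.

```python
def allocate_frequencies(cpu_demand, gpu_demand, budget=1.0,
                         cpu_max_power=0.4, gpu_max_power=0.8):
    """Return (cpu_scale, gpu_scale) frequency multipliers in (0, 1].

    cpu_demand / gpu_demand: fraction of each unit's peak power its
    current workload would draw at full frequency. All constants are
    made-up illustrative values, not PS5 specs.
    """
    cpu_power = cpu_demand * cpu_max_power
    gpu_power = gpu_demand * gpu_max_power
    total = cpu_power + gpu_power

    if total <= budget:
        # Cases b and d above: one unit's slack leaves budget headroom,
        # so both units keep their maximum frequency.
        return 1.0, 1.0

    # Case c: combined demand exceeds the budget. Scale power down to
    # fit; since power ~ frequency^3, the frequency drop is mild.
    power_scale = budget / total
    freq_scale = power_scale ** (1 / 3)
    return freq_scale, freq_scale
```

For example, a fully loaded GPU with the CPU at half demand (`allocate_frequencies(0.5, 1.0)`) fits exactly in the budget and both stay at full clocks, while maxing both units only trims frequencies by a few percent, which is the point Cerny was making.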
Regional Analysis (only MS and Sony Consoles)
Europe => XB1: 23-24% vs PS4: 76-77%
N. America => XB1: 49-52% vs PS4: 48-51%
Global => XB1: 32-34% vs PS4: 66-68%