So I misunderstood; you're saying the GPU also sends unused power to the CPU (on PS5 it seemed like only the CPU-to-GPU path was implemented).
Still, when he said 10% less power could be achieved by dropping the frequency only a couple of percent, with minimal performance loss, it remains to be seen how that works out when games release. Because right now it could be damage control, and it may drop more than 2% (perhaps to the leaked 9.2).
SmartShift works in both directions.
And you are right that 10% less power might only reduce the clockrate by a small percentage... All processors have an "efficiency curve", and once you exceed it, a small increase in clockrate can drive power consumption up disproportionately.
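A rough way to see why: dynamic power scales roughly with frequency times voltage squared (P ≈ C·f·V²), and higher clocks need higher voltage, so past the sweet spot power climbs much faster than frequency. A toy illustration only; the voltage curve and constants below are invented, not real silicon data:

```python
# Toy model: dynamic power ~ C * f * V^2, with voltage rising steeply past
# the efficiency sweet spot. All numbers are illustrative, not real GPU data.
def power_watts(freq_mhz, c=0.1):
    # Hypothetical voltage curve: roughly flat below 1100MHz, steep above it.
    volts = 0.9 + max(0.0, (freq_mhz - 1100) / 1000) ** 1.5 * 2.0
    return c * freq_mhz * volts ** 2

for f in (1100, 1250, 1450):
    print(f"{f}MHz -> {power_watts(f):.0f}W")
```

In this toy curve, going from 1100MHz to 1450MHz is about a 32% clock increase but more than doubles power, which is the shape of the effect being described (not its exact magnitude).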
E.g. Polaris was pretty efficient at a 1120MHz core clock and 150W TDP.
But what AMD later did was take that exact GPU and clock it up to 1257MHz with the RX 580, an increase of 137MHz or 12.2%.
However, TDP went from 150W to 185W, an increase of 35W or 23.3%... So by extension the GPU was regarded as "less efficient".
AMD would of course still take that same chip, shrink it to 12nm (which is basically a refined 14nm process anyway, so the "12nm" label is mostly marketing) and boost clockrates to 1469MHz, an increase of 31% over the RX 480 and 16.9% over the RX 580. But TDP increased to 225W, an increase of 50% over the RX 480 and 21.6% over the RX 580...
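For what it's worth, those percentages check out; here is a quick sanity check using only the clock and TDP figures quoted above (the "12nm respin" label is mine, for the 12nm part described in the post):

```python
# Clock (MHz) and TDP (W) figures as quoted in the post above.
cards = {
    "RX 480": (1120, 150),
    "RX 580": (1257, 185),
    "12nm respin": (1469, 225),
}

base_clock, base_tdp = cards["RX 480"]
for name, (clock, tdp) in cards.items():
    clock_gain = (clock / base_clock - 1) * 100
    tdp_gain = (tdp / base_tdp - 1) * 100
    print(f"{name}: +{clock_gain:.1f}% clock, +{tdp_gain:.1f}% TDP")
```

TDP growing roughly 1.5x to 2x faster than clockrate at each step is exactly the "past the efficiency curve" behavior being argued here.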
The fact is though, we don't know exactly what kind of hit to clockrates we are seeing with an accompanying decrease in TDP on the Playstation 5, we aren't privy to that information just yet.
This was one of the points Cerny definitely should NOT have made in his talk, because it confuses people more than necessary. I guess the temptation to insert a little PR with a buzzword was too strong, so he added the SmartShift bit.
Because you know more than Cerny, right?
You are happy to use "Cerny said this" to back up your arguments... But then downplay Cerny's comments when they contradict your own.
Again, you don't understand "maximum" in that context. Both CPU and GPU could run faster than 3.5GHz and 2.23GHz (obviously, as the CPU in the XSX runs at 3.66GHz). The cap keeps both CPU and GPU BELOW the maximum clockrates possible, in order to stay within the power limit set by the cooling (which I really want to see).
If you are asserting that the Playstation 5 will operate its CPU and GPU at higher than 3.5GHz and 2.23GHz respectively... Then that is a bold assertion and you need to prove it.
Now if you ran Pong! on the PS5, the power draw on the CPU would be almost nil and SmartShift could in theory send dozens of watts to the GPU. On the other hand, to save power, the GPU would probably be downclocked anyway to keep the whole system cool. It's Pong!, after all.
That is not what Smartshift does.
It is an allocation within the power/thermal limits, not a reduction.
As the PS5 cooling system is apparently laid out to let both CPU and GPU simultaneously run at capped speeds (NOT maximum speeds) under non-Death Valley conditions, SmartShift does absolutely nothing there (except perhaps downclocking the CPU to cool the system). When the Tempest hardware and the SSD hardware are in full operation, my guess is downclocking will occur, with a possible SmartShift contribution to hold the GPU at capped speed.
Again: SmartShift isn't about downclocking. It's all about dynamic allocation.
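As a conceptual sketch only (this is NOT AMD's actual SmartShift algorithm; the budget figure and the proportional-scaling policy are invented for illustration), "dynamic allocation under a shared budget" looks something like this:

```python
# Conceptual sketch of shared-budget power allocation between CPU and GPU.
# NOT AMD's real SmartShift logic; numbers and policy are made up.
TOTAL_BUDGET_W = 200  # hypothetical combined CPU+GPU power budget

def allocate(cpu_demand_w, gpu_demand_w, total=TOTAL_BUDGET_W):
    """Grant both demands if they fit; otherwise scale both down so the
    combined draw never exceeds the shared budget."""
    demand = cpu_demand_w + gpu_demand_w
    if demand <= total:
        # Under budget: each side gets what it asks for, and any headroom
        # one side leaves unused is available to the other.
        return cpu_demand_w, gpu_demand_w
    # Over budget: scale both down proportionally to respect the cap.
    scale = total / demand
    return cpu_demand_w * scale, gpu_demand_w * scale

# Light CPU load (the Pong! case): the GPU can use the CPU's unused headroom.
print(allocate(20, 170))
# Both sides pushing hard: the shared cap forces both down.
print(allocate(90, 160))
```

The point of the sketch is the first case: nothing is "reduced" when the total demand fits the budget, which is why allocation and downclocking are two different things.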
So when does SmartShift actually kick in? That is the part that was a little too fuzzy in Cerny's talk; he never answered that unasked question. Looking at the various statements he made about the whole power management, and about what happens when and for how long, a few open questions remain.
We know exactly when SmartShift kicks in; that particular AMD technology is well documented at this point.