drkohler said:

In the PS4Pro, the cpu and gpu seem to be using different clock generators (like the Xbox, unlike the PS4). The problem with AMD designs is they use a "pack as much as possible as tightly as possible" philosophy which tends to limit viable speeds rapidly, unlike NVidia, which uses a "keep it simple and leave space to breathe" design philosophy.

You have made many assumptions.

What Microsoft did with the Xbox One was boost its "base clock", not just the individual multipliers, which meant that everything tied to that clock was boosted. You can safely bet the PS4 Pro is the same; it's an approach both AMD and Intel have settled on in their chip designs as good practice.
Heck, even nVidia reverted to it with the removal of their separate shader clock (shaders now run at the core clock).
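As a toy illustration of why a base-clock bump touches everything at once, here's a minimal sketch of a shared clock tree; the domain names and multipliers are hypothetical, not the Xbox One's actual clock generator:

```python
# Toy model of a shared clock generator: each domain derives its clock
# from the base clock via a fixed multiplier, so raising the base clock
# raises every domain in lockstep. Names/multipliers are hypothetical.
MULTIPLIERS = {"cpu": 16.0, "gpu": 8.5, "memory": 21.3}

def derived_clocks(base_mhz: float) -> dict:
    return {domain: base_mhz * mult for domain, mult in MULTIPLIERS.items()}

print(derived_clocks(100.0))  # baseline clocks for every domain
print(derived_clocks(106.0))  # a 6% base-clock boost lifts cpu, gpu and memory by 6%
```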

As for AMD and nVidia: nVidia modified chunks of its architecture to be able to drive up the clock rate on Pascal; AMD did not.
AMD and nVidia do an equal amount of work trying to pack in as many transistors as possible at every node, by using different types of transistors, optimizing layouts to reduce leakage, etc.
This was most evident during the prolonged 28nm cycle.

Polaris isn't clock-rate limited because of how AMD packed its transistors; nVidia specifically set out to make Pascal a high-clocking part.


drkohler said:

Generally, the higher the clock rate, the more current you must provide (and remove!).

There is a ton more to it than that.
For instance...
Higher clock rates don't always mean more current; there are different types of transistors, each with its own efficiency curve, and some transistors are at their most efficient at high clock rates.

Of course, layout and technologies like Resonant Clock Mesh play a role as well.
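For reference, the first-order switching-power relation for CMOS logic is P ≈ α·C·V²·f, and the squared voltage term is what really bites when chasing clocks, since higher frequencies usually demand a voltage bump. A minimal sketch; the activity factor, effective capacitance and voltages below are invented for illustration and describe no real GPU:

```python
# First-order CMOS dynamic (switching) power: P ~ alpha * C_eff * V^2 * f.
# Frequency enters linearly, but voltage enters squared, so a clock bump
# that also needs more voltage costs disproportionately more power.
def dynamic_power(alpha: float, c_eff: float, voltage: float, freq_hz: float) -> float:
    """Switching power in watts (all inputs illustrative, not measured)."""
    return alpha * c_eff * voltage ** 2 * freq_hz

base = dynamic_power(alpha=0.2, c_eff=6e-7, voltage=1.00, freq_hz=900e6)
fast = dynamic_power(alpha=0.2, c_eff=6e-7, voltage=1.15, freq_hz=1200e6)
print(f"{base:.0f} W -> {fast:.0f} W: {fast / base:.2f}x the power for {1200 / 900:.2f}x the clock")
```

With these made-up numbers, a 1.33x clock increase plus a voltage bump costs roughly 1.76x the power, which is the point: the relationship is not a straight line through current alone.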

drkohler said:

At school, we learned that P = I^2*R is the heat dissipated in resistors, and you see it goes with the square of the current I. AMD designs have a tendency to "totally lose it" above a certain current. A stock RX480 card already reaches 150W, and a 6TF card easily passes 200W.

Resistors aren't the same as transistors, so that equation can't be applied directly here.

The RX 480, in a nutshell, was a disaster in terms of performance per watt. With that said, teraflops have no direct correlation with energy consumption.
Remember, it is shaders * 2 * clock rate.
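To make that arithmetic concrete, here's a quick sketch using the RX 480's published figures (2304 stream processors, ~1266 MHz boost clock); the two 6TF configurations after it are hypothetical, just to show the same throughput can be reached wide-and-slow or narrow-and-fast:

```python
# Peak single-precision throughput: shaders * 2 FLOPs per clock (one FMA) * clock rate.
def teraflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

print(teraflops(2304, 1.266))  # RX 480: ~5.83 TFLOPS
print(teraflops(2816, 1.065))  # hypothetical wider, slower chip: ~6.0 TFLOPS
print(teraflops(2304, 1.302))  # hypothetical narrower, faster chip: ~6.0 TFLOPS
```

Two chips with the same teraflop figure can sit at very different points on the voltage/frequency curve, which is why TFLOPS alone says little about power draw.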

Typically, a GPU will be more conservative in its energy demands the wider it is (up to a point).
To think Polaris is going to be any kind of representation of what Scorpio is going to be is pretty silly, considering we have no idea what chip it will be derived from or how that chip is going to be fed.

drkohler said:

Then you have to add an 8-core "better than Jaguar" CPU which adds 60-90W depending on speed.


As for Jaguar and its successors, keep in mind they are low-clocked, low-power parts. Jaguar's successors, Puma and Puma+, have tangible benefits: they run at the same clocks with reduced power consumption at the same node.
Carrizo-L, based on Puma+, will happily use 12-25W of power in a quad-core configuration, but that's not just the CPU cores themselves; it includes the graphics units, memory controllers and various other pieces of logic, and is thus not representative of how much energy the CPU cores themselves consume.

drkohler said:

For a console, a SoC that draws more than 200W is no longer viable economically.


Sure it is. The PC space has produced multiple technologies for more efficient heat removal, such as vapor chambers.
Power supplies have also been able to deliver more and more power at a lower price over the years.

The original Xbox 360, for example, had a 203W PSU and drew up to 198W, but that was also over a decade ago; things have gotten far better since then. ;)

drkohler said:

The PS4Pro's gpu clock was chosen to be in a safe thermal spot, and that seems to be the purported 911MHz. There is a common myth that "you just use a better cooler", but that is a dangerous train of thought. Because chips get smaller faster than current decreases, at some point you can no longer dissipate the generated heat fast enough economically. For consoles, the price budget for cooling probably is in the $8-$12 range.


That used to be true, but we have gotten far better at removing heat; TDPs of 300W and beyond are pretty viable now.
Vapor chambers, heat pipes, phase change, big 140mm fans, closed-loop water coolers... We aren't just dropping a chunk of aluminum or copper with some thermal compound on the chip anymore.


drkohler said:
With consoles, a fixed clock rate is mandatory. Using temporary boost clocks (of whatever kind) would make game performance unpredictable. The one mantra of consoles is that every console must deliver the same performance at all times.
With consoles, it is always price first, speed second. The price requirement pretty much limits what you can do inside a console.

While I agree that boost clocks are silly for a console...

The price argument isn't as relevant this generation, though, with the introduction of higher-priced, higher-performing, more "premium" tiers of hardware.



--::{PC Gaming Master Race}::--