binary solo said: Question: Are there any advantages to running a GPU at minimum or average clock vs maximum/overclock? I assume a GPU needs to work with all other components of the system, which means its operating speeds need to be in sync with the CPU etc. So is PS4P going with the (probable) minimum clock more about how the GPU needs to operate in sync with the CPU than there being any inherent benefit to operating at minimum? I assume higher clock means more heat, but that shouldn't matter if you have decent cooling/venting.
Yes, higher clock = more heat.
Anyways, here's the issue. Not all GPUs are made the same. In truth, they are all just a little bit imperfect.
Say Sony can get 400 APUs from a single wafer. They could be paying $32,000 per wafer, which comes to around $80/chip. Not all those chips are perfect, and in most cases not all of them can be clocked to their max frequencies because of it. This is what is referred to as the "maturity" of a fab process. With a fab process as young as the 14nm one AMD/Sony/MS are using, a great deal of those chips will be "defective".
But in most cases that just means they can't be clocked up to perform as high as they possibly could go, or in the worst case there are things in them that are just broken.
So Sony/MS, or anyone for that matter, find workarounds for this. Since you can't guarantee that every chip can clock up to 1.2GHz, you clock it down to 910MHz and at least ensure that you can use every single chip (or at least the majority of them) that comes off that wafer. Furthermore, you may even disable 1/2 the cores. All these things are done to improve yield and thus reduce costs.
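To make the binning idea concrete, here's a toy sketch of that sorting pass. The 910MHz target comes from the post; the salvage threshold and bin names are made up for illustration:

```python
# Toy chip-binning pass. TARGET is the conservative shipping clock from
# the post; the salvage cutoff below it is a hypothetical number.
TARGET_MHZ = 910

def bin_chip(max_stable_clock_mhz: int) -> str:
    """Classify a chip by the highest clock it can hold stably."""
    if max_stable_clock_mhz >= TARGET_MHZ:
        # Ships at the conservative clock even if it could do 1200MHz.
        return "ship at 910MHz"
    if max_stable_clock_mhz >= TARGET_MHZ * 0.9:
        # Hypothetical salvage bin: sell it with some cores disabled.
        return "ship with cores disabled"
    return "reject"

# A mixed batch off one wafer:
for clock in (1200, 950, 880, 600):
    print(clock, "->", bin_chip(clock))
```

The point is that only the last bucket is thrown away; everything else still earns back part of the wafer cost.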
If AMD/MS decides they are gonna put a 6TF GPU in every APU, then they would only be able to use the ones that can clock that high and remain stable. So out of the 400 chips on that wafer they may end up being able to use only about 250. But they will STILL pay $32k for the bloody wafer, even though they have 150 fewer usable chips.
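The cost math above is easy to check. Using the post's own (hypothetical) numbers, $32k spread over fewer usable chips means each good chip gets more expensive:

```python
# Yield economics from the post (all figures hypothetical examples).
WAFER_COST = 32_000  # dollars per wafer, paid regardless of yield

def cost_per_usable_chip(usable_chips: int) -> float:
    """Wafer cost spread over only the chips that pass binning."""
    return WAFER_COST / usable_chips

# Conservative clock: every one of the 400 chips is sellable.
print(cost_per_usable_chip(400))  # 80.0 dollars/chip
# Top-bin only: just 250 of the 400 make the cut.
print(cost_per_usable_chip(250))  # 128.0 dollars/chip
```

So demanding the highest bin raises the effective per-chip cost by 60% in this example, which is exactly the trade-off the post describes.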
PS4 is going with the minimum for stability and to reduce costs. AMD can run their GPUs at a higher clock because they know they are charging more for just the GPU. If Sony wanted to set, say, 1.2GHz as the minimum clock, meaning that all GPUs are guaranteed 5.5TF, then it means they would only take GPUs that can be overclocked to 1.3GHz and still remain stable for prolonged periods. That in turn reduces the amount of usable GPUs they can get from a wafer, which drives up the price.







