Persistantthug said:

But you don't know what the wattage or the cost of a custom GPU chip would be. Keep in mind that I said chip....because unlike what you might buy for your PC with a 7950 or 7970, for example, consoles only use the chip....not the whole board.

The board/PCB is one of the least expensive components of a graphics card. The thermal/cooling system, the actual GPU chip, and the dedicated VRAM are where the bulk of the costs are, and all of those have to be in the console too. This is why the distinction between the total cost of a console GPU and a full graphics card on the PC matters less than most people assume. Here is the breakdown of the actual component costs for GPUs from last generation. It's not directly comparable to HD7970/GTX680, but it gives you an idea of what AMD/NV charge Add-In Board Partners (AIBs) for each part.
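To make the chip-vs.-board point concrete, here's a minimal sketch of what such a bill-of-materials breakdown looks like. Every line item and dollar figure below is a made-up placeholder for illustration, not the actual AIB numbers from the breakdown, but it shows why dropping the PCB barely moves the total:

```python
# Hypothetical BOM for a mid-range graphics card kit.
# All figures are illustrative placeholders, NOT real AIB pricing.
bom = {
    "GPU chip":        60.0,  # the silicon itself, bought from AMD/NV
    "GDDR5 VRAM":      40.0,  # dedicated memory chips
    "cooler/thermals": 15.0,  # heatsink, fan, thermal interface
    "VRM/power":       12.0,
    "PCB/board":       10.0,  # the one part a console doesn't reuse as-is
    "misc/assembly":    8.0,
}

total = sum(bom.values())
for part, cost in sorted(bom.items(), key=lambda kv: -kv[1]):
    print(f"{part:<16} ${cost:>6.2f}  ({cost / total:5.1%} of total)")
```

Even with made-up numbers, the board ends up under 10% of the kit, so "console GPU cost" and "graphics card cost" track each other closely.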

Replace the HD6950 with an HD7950 to get your 3+ TFLOPS estimate and you are looking at $150-160 for the graphics card kit, purchased directly from AMD. But that assumes the HD7950 costs as much as the HD6950 to manufacture. It likely costs more: the HD6950 debuted at $299, while the HD7950 launched at $449, and AMD's GPU division isn't exactly making money hand over fist, which suggests manufacturing costs are higher for the current generation than for the HD6xxx series. Many articles, as well as NV's own presentations, have also stated that cost per wafer increased from the 40nm to the 28nm generation; this is actually known since TSMC raised 28nm wafer prices vs. the 40nm generation, which means 28nm GPUs cost AMD substantially more to fab than the HD6950/6970 series. Anything faster than an HD7870 2GB (HD7970M) sounds like wishful thinking in a PS4/720 console due to cost and power consumption limitations, unless they take something like an HD7950/8950 and gimp it on the ROP/memory bandwidth side (but then it won't be a high-end GPU).

The HD7870 has 20 Compute Units, and each Compute Unit has 64 Stream Processors. If they remove 2 Compute Units (the red GCN blocks in the Pitcairn block diagram in the TechReport review linked further down) and drop the GPU clock from 1000 MHz to 800 MHz, you end up with a 10% cut-down HD7870 with a 20% reduction in GPU clock speed:

An 18-Compute-Unit custom "Pitcairn" HD7870 in PS4:

18 CUs x 64 SPs = 1152 Stream Processors @ 800 MHz GPU clock x 2 floating-point ops/clock = 1.84 TFLOPS.

^^^ A slightly cut-down HD7870 actually sounds like a very reasonable rumor for PS4 because these specs are very similar to AMD's fastest single GPU in the mobile space -- HD7970M:

1280 SPs @ 850 MHz

http://www.notebookcheck.net/AMD-Radeon-HD-7970M.72675.0.html
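If you want to sanity-check the math, here's a quick sketch of the peak-FLOPS formula used above. The function name and defaults are mine, but the arithmetic is exactly SPs x clock x 2 FLOPs/clock (each GCN SP can do a fused multiply-add per cycle):

```python
def theoretical_tflops(compute_units: int, sps_per_cu: int = 64,
                       clock_mhz: float = 800, flops_per_clock: int = 2) -> float:
    """Peak single-precision TFLOPS for a GCN-style GPU.

    SPs * clock (Hz) * FLOPs per SP per clock (FMA = 2 ops).
    """
    sps = compute_units * sps_per_cu
    return sps * clock_mhz * 1e6 * flops_per_clock / 1e12

# Rumored 18-CU cut-down Pitcairn @ 800 MHz
print(theoretical_tflops(18, clock_mhz=800))   # 1.8432 -> ~1.84 TFLOPS

# HD7970M: 1280 SPs (20 CUs) @ 850 MHz
print(theoretical_tflops(20, clock_mhz=850))   # 2.176 -> ~2.18 TFLOPS
```

So the rumored PS4 part and the HD7970M land within ~15% of each other on paper, which is why the comparison holds up.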

If we consider that the PS4 is not going to be the size of a mid-tower PC, we can deduce that thermal/heat dissipation limits and power consumption are much more important. In the smaller form factor of the PS4, using a cut-down Pitcairn is akin to using a near-top-of-the-line mobile GPU from AMD's laptop product stack. I'd say that's a pretty reasonable balance of power consumption, performance, and cost. There is also a dramatic size difference between the HD7870 "Pitcairn" at 212mm2 and the HD7950/7970 "Tahiti XT" at 365mm2 (a 72% larger chip). The price of a 300mm wafer is essentially fixed, but you can fab more of the smaller chips on it, so your cost to manufacture each chip is lower. Yields are also higher, since larger chips tend to have more manufacturing flaws and are harder to fab. Over millions of chips, these costs and yields add up.

http://techreport.com/review/22573/amd-radeon-hd-7870-ghz-edition
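To illustrate the die-size economics, here's a rough sketch using the standard dies-per-wafer approximation and a simple Poisson yield model. The wafer price and defect density below are assumptions picked for illustration, not TSMC's actual figures; only the two die sizes come from the post above:

```python
import math

WAFER_DIAMETER_MM = 300.0
WAFER_PRICE_USD   = 5000.0  # assumed 28nm wafer price -- illustrative only
DEFECT_DENSITY    = 0.4     # defects per cm^2 -- assumed, illustrative only

def dies_per_wafer(die_area_mm2: float,
                   diameter_mm: float = WAFER_DIAMETER_MM) -> int:
    """Standard approximation: wafer area / die area, minus edge losses."""
    r = diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float,
                  d0_per_cm2: float = DEFECT_DENSITY) -> float:
    """Poisson yield model: bigger dies catch more defects."""
    return math.exp(-die_area_mm2 / 100.0 * d0_per_cm2)

for name, area in [("Pitcairn (HD7870)", 212.0),
                   ("Tahiti XT (HD7950/70)", 365.0)]:
    candidates = dies_per_wafer(area)
    good = candidates * poisson_yield(area)
    print(f"{name}: {candidates} candidate dies, ~{good:.0f} good, "
          f"~${WAFER_PRICE_USD / good:.0f} per good die")
```

Under these assumptions Pitcairn yields roughly three times as many good dies per wafer as Tahiti, so the per-chip cost gap is much bigger than the 72% area difference alone suggests.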

@ superchunk,

For PS4 display support, I think upscaling to 4K TV resolution will be supported. Sony will want to market the PS4 as the next-gen multimedia device, and since the HD7000 series supports 4K displays, it's not unreasonable to believe that the PS4 could upscale movies to 4K, support 4K Blu-ray, etc. The PS4 could still have 4K media capability even if games run at 1080p only.