ethomaz said:

disolitude said:

A 7970M for 100 bucks? I want what you're smoking... the 7970M is a $200 upgrade over a 7800-series GPU on most laptops.

Do you know what a single GPU or CPU chip actually costs? $200 is the retail price for us consumers... and the same chip is used across other models, so a $100 chip ends up in $200 - $500 products... it all depends on how high the chip can clock.

I will try to explain how much a single chip costs to manufacture...

1. First, you need to know that there is no cost to manufacture a single GPU/CPU chip... the cost is to manufacture one wafer of chips... the price of a single 300mm 28nm wafer manufactured by TSMC today is ~$5,000... and that's fixed whether the chip is complex or not.

2. What defines the final cost of a single chip is how many "good" chips can be cut from that wafer... if the chip is simple and small, the wafer yields more chips; if it is complex and big, the wafer yields fewer... e.g. a 300mm 65nm wafer can hold 94 NVIDIA GT200 dies, while a 300mm 45nm wafer can hold 2,500 Atom processors... of course those are the maximum numbers of chips that fit on those wafers, and remember the "bad" chips are useless... for a simple chip the loss to "bad" chips is low (less than 10% of the chips on a wafer), but when a chip is complex and production is not mature yet, you get a lot of "bad" chips, like the GT200 or Cell (nearly 50% of the chips on a wafer were lost).
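To make that math concrete, here is a minimal Python sketch of the idea... the die counts and yield figures are the rough examples from this post, and the wafer prices are just illustrative assumptions (the post only quotes $5,000 for 28nm and $8,000 for 55nm), not official data:

# Rough per-chip cost model: a wafer has a fixed price, so the cost of one
# usable chip is simply the wafer price divided by the number of good dies.

def cost_per_good_die(wafer_price, dies_per_wafer, yield_rate):
    good_dies = int(dies_per_wafer * yield_rate)  # "bad" dies are worth nothing
    return wafer_price / good_dies

# Complex chip on an immature process (GT200-like: 94 dies, ~50% lost):
print(cost_per_good_die(8000, 94, 0.5))    # ~$170 per usable die

# Simple chip on a mature process (Atom-like: 2,500 dies, <10% lost):
print(cost_per_good_die(5000, 2500, 0.9))  # ~$2 per usable die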

The NVIDIA GT200 was a big and complex chip that cost ~$112 per chip to make... of the 94 chips, nearly 40 were lost, so the wafer gave you just 50 chips in the end... the GT200 powered two retail products, the GTX 280 ($600) and the GTX 260 ($300)... NVIDIA chose to also sell dies with at least 192 of the full 240 SPs usable (as the GTX 260), because less than 10% of the dies came out with all 240 SPs working... a wafer with 90% of the chips "bad" is impossible to make money with... but at least 60% of the dies had 192 SPs usable, so binning the partially defective dies made most of the wafer sellable.

I give that example because the GT200 was the first chip to cost more than $100 to make... its die was 500-600mm² (a monster). So any chip under 500mm² made at 55nm was cheaper than the GT200 (< $100).

That ~$100 for the GT200 is based on a 55nm process with a wafer cost of $8,000... a 28nm wafer costs $5,000... so it's cheaper... and on top of that, the same design shrunk to 28nm would have a much smaller die, so far more dies fit per wafer and the cost per chip drops below $50.

3. Now talking about the PS4... the estimated size of the 7970M GPU is 212mm² at 28nm... a 300mm wafer can hold ~300 chips of ~200mm² (it all depends on the chip's shape... square, rectangular, etc.)... so a $5,000 wafer gives ~300 candidate chips... now it all depends on how many of those chips are good to use (there's a quick sketch of this math below the list):

* 90% of the chips are good: $5000 / 270 chips = $19 per chip
* 70% of the chips are good: $5000 / 210 chips = $24 per chip
* 50% of the chips are good: $5000 / 150 chips = $34 per chip

That's the price of a single 7970M chip... not the full video card or the retail price for consumers.
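For anyone who wants to check those numbers, here is a small Python sketch... the dies-per-wafer formula is the standard textbook approximation (wafer area over die area, minus an edge-loss correction); the $5,000 wafer price and the yield tiers are the post's figures:

import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Standard approximation: wafer area / die area, minus a correction
    # for the partial dies lost around the wafer edge.
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_PRICE = 5000                 # ~$5,000 for a 300mm 28nm wafer
dies = dies_per_wafer(300, 212)    # ~287 candidate dies, close to the ~300 above

for y in (0.9, 0.7, 0.5):
    good = int(dies * y)
    print(f"{y:.0%} yield: {good} good chips -> ${WAFER_PRICE / good:.0f} per chip")

The outputs ($19, $25, $35) land within a dollar of the figures above; the small differences come from the formula giving ~287 candidate dies instead of a round 300.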

4. Back to the PS4, and how Sony could buy or manufacture the 7970M chip... e.g. assuming 70% of the chips are good (a toy version of this comparison follows below the list):

+ Sony can buy the design and manufacture it itself (like Microsoft did with the R500): ~$25 per chip
+ Sony can ask AMD to manufacture them: ~$40-45 per chip (AMD uses GlobalFoundries... so ~$10 margin for each company per chip)
+ Sony can buy the design and manufacture at TSMC: ~$35-40 per chip (Sony just needs to pay TSMC)
+ Sony can ask AMD to manufacture at TSMC: ~$45-50 per chip (same as the AMD plus GlobalFoundries case, but I think TSMC is a little more expensive)

In any case it's impossible for a 212mm² 28nm chip on a 300mm wafer to cost Sony over $50 per chip... it is just impossible... unless fewer than 30% of the chips come out good and usable.
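Those four options boil down to the same thing: the fab cost per good chip, plus a margin for each company in the chain. A toy sketch of that comparison... every margin here is a guess inferred from the ranges in the post, not a real contract figure:

FAB_COST = 24  # per good chip at 70% yield, from the estimate above

# Each scenario adds a rough margin per company in the chain; all of these
# margins are guesses inferred from the ranges in the post.
scenarios = {
    "Sony buys the design and fabs it itself": FAB_COST + 1,        # ~$25
    "AMD builds it at GlobalFoundries":        FAB_COST + 10 + 10,  # ~$44
    "Sony buys the design, fabs at TSMC":      FAB_COST + 12,       # ~$36
    "AMD builds it at TSMC":                   FAB_COST + 10 + 14,  # ~$48
}

for option, cost in scenarios.items():
    print(f"{option}: ~${cost}")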

So what am I smoking??? I think the guys here know nothing about how much a CPU/GPU chip costs.

 

That said... I think the full chip, with CPU and GPU together (an APU), will be ~400mm²... bigger, more complex, fewer good chips per wafer... so I estimate a cost to Sony of $60 to $80 per chip at ~50% good chips. I put $200 in my estimate for you... that's an overestimate covering everything needed to get this chip working on the console motherboard... even the components to keep it cool and below critical temperatures.
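Running the same rough wafer math on that guess does land inside the $60-80 window... again just a sketch, with the ~400mm² die size, $5,000 wafer, and 50% yield all being the post's assumptions (the helper is repeated so this runs standalone):

import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

dies = dies_per_wafer(300, 400)   # ~143 candidate dies for a ~400mm² APU
good = int(dies * 0.5)            # ~71 good chips at 50% yield
print(5000 / good)                # ~$70 per chip -> inside the $60-80 estimate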

 

And yes... AMD and Intel sell the same ~$50 chips in CPU models from $200 to $999 (the chips with the highest clock potential go into the top $999 CPUs).

This is a very informative post covering wafer costs and chip manufacturing.

However, your pricing doesn't take into account the real cost of a GPU, CPU or any chip... its development. Skilled staff, research and development, RTL hardware guys, testing, drivers, SDKs/APIs, marketing, license fees, patent disputes, equipment, etc. The more complex the chip, the higher the R&D costs. A 40nm chip is used in the AMD 6450 that retails for 39 bucks, while the 40nm 6970 retailed for $399. According to your logic both are 40nm, so they must cost the same, right? Even with different yields, where does the 10x markup come from? R&D for the 6970 was much higher than for the 6450, hence the higher cost to OEMs and consumers.

Your post seems to assume that just because AMD can produce GPUs at $24 per chip at a 70% yield rate, that is the price at which AMD would sell that GPU to Sony (+$20 for manufacturing, apparently). The very first 7900M GPU cost AMD hundreds of millions of dollars to develop, and they have to recoup that cost somehow and hopefully make some money on it.

Now I know it's not $300 per unit for Sony to use this GPU, but my post said "cost of parts at retail"... That is the only information we can use unless someone has the inside scoop at Sony and AMD. Everything else at this point is pure speculation.