ethomaz said:

disolitude said:

I fully understand your points but I disagree with this logic.

Let's take the PS3's launch for example. The initial estimated cost of the PS3 was $840, per iSuppli.

http://www.edge-online.com/news/isuppli-60gb-ps3-costs-840-produce/

"Some of the more expensive PS3 components that were charted by iSuppli include Nvidia’s $129 RSX graphics chip, the $125 blu-ray disc drive and the $89 IBM Cell processor."

So the RSX, which was based on the older 90nm process and offered middle-of-the-road performance at the time, cost Sony $129. The RSX die is just under 200mm², so let's say 300 chips per wafer, like you stated before.

Because this isn't a cutting-edge chip, let's say the yield was 90%: $5000 / 270 chips = ~$19 per chip.
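
For reference, here's that arithmetic as a quick Python sketch (illustrative only: the $5000 wafer price, ~200mm² die, and 90% yield are the assumptions above, and the die count uses a common rough dies-per-wafer estimate with an edge-loss term):

```python
import math

# Illustrative numbers only, matching the assumptions above.
wafer_diameter = 300.0   # mm, assuming a 300mm wafer
die_area = 200.0         # mm^2, RSX die is just under 200mm^2
wafer_cost = 5000.0      # dollars per wafer (assumed)
yield_rate = 0.90        # assumed fraction of good dies

# Rough dies-per-wafer estimate: wafer area / die area,
# minus an edge-loss correction term.
dies = (math.pi * (wafer_diameter / 2) ** 2 / die_area
        - math.pi * wafer_diameter / math.sqrt(2 * die_area))
good = dies * yield_rate

print(f"candidate dies per wafer:  {dies:.0f}")                # ~306
print(f"good dies at 90% yield:    {good:.0f}")                # ~276
print(f"silicon cost per good die: ${wafer_cost / good:.0f}")  # ~$18
```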

So iSuppli states that the RSX cost Sony $129, yet each chip costs about $19 to manufacture.

How so?

lol

1. I'm pretty sure the 90nm RSX was made on 200mm wafers... only the 65nm shrink moved to 300mm wafers... a 300mm wafer has roughly 2.25x the area of a 200mm one... so on 200mm you're cutting something like 150 chips instead of 300...

2. I'm also pretty sure the RSX's percentage of good chips was below 90%... so fewer than 130 usable chips per wafer.

3. A current 300mm wafer at 28nm costs $5000... a 200mm wafer at 90nm cost $8000-$10000 in 2006.

So... $10000 / 130 = ~$77, but that's not what Sony pays... Sony has to pay both NVIDIA and TSMC to manufacture the chips... in 2006 the RSX was close to $100 per chip to manufacture... even more if the number of good chips was low.

A yield of 70% or 80% good chips can get you to the $129 estimated by the site (rough numbers sketched below).
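
As a rough sketch of how much the yield assumption matters under those 200mm-wafer numbers (the wafer price and die count are this post's assumptions, not confirmed figures):

```python
# Illustrative only: the 200mm/90nm wafer assumptions from this post.
wafer_cost = 10000       # assumed 2006 price for a 90nm 200mm wafer
candidate_dies = 150     # assumed dies cut from a 200mm wafer

for yield_rate in (0.9, 0.8, 0.7):
    good = candidate_dies * yield_rate
    print(f"yield {yield_rate:.0%}: {good:.0f} good dies -> "
          f"${wafer_cost / good:.0f} each, before NVIDIA/TSMC margins")
```

On those assumptions the bare silicon lands in roughly the $75-$95 range, so $129 becomes plausible once NVIDIA's and TSMC's cut, packaging, and test are added on top.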


1. That's just not true. 300mm wafers have been in use since the early 2000s, even at 130nm nodes, let alone 90nm. All one has to do is google "300mm wafer 90nm". There is no way anyone as big as Nvidia would have used 200mm wafers in 2006, for cost reasons alone.

Here is a nice PDF from 2003 about the switch from 200mm to 300mm wafer technology.

http://www.google.ca/url?sa=t&rct=j&q=&esrc=s&frm=1&source=web&cd=4&cad=rja&ved=0CFsQFjAD&url=http%3A%2F%2Fwww.nanobuildings.com%2Fbat%2Fpresentations%2Fdownloads%2FMJamison_Presentation.pdf&ei=MC_7UJGiHoaFrAHjg4Bg&usg=AFQjCNHjo_VGZy63bXSRNf94_U8-9K5h_w&sig2=daGiC7s0TT9EEAoyE5_OoA

2. Why would the RSX, which was not a very good chip in 2006, have lower yields than the 7970M, which is the best mobile chip money can buy in 2012? The RSX is not the Cell or an Nvidia GeForce 200 series processor, which were cutting edge. It's a fairly basic GPU... even if the yield wasn't 90%, it had to be high.

3. As for the wafer cost figures you're quoting... let's see some links. Searching on Google, I see 300mm wafer prices quoted anywhere from $2000 to $10000.

Bottom line: the manufacturing cost math doesn't add up once you accept that a 300mm wafer was used, which it was. All this "I'm pretty sure" stuff is just a way to spin the argument in your favor.
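
To put rough numbers on that, here is the same per-die arithmetic for a 300mm wafer across the wafer prices and yields thrown around in this thread (toy figures, raw silicon cost only, ignoring packaging, test, and NVIDIA's margin):

```python
# Illustrative only: 300mm wafer, ~200mm^2 die (~300 candidate dies).
candidate_dies = 300

for wafer_cost in (2000, 5000, 10000):
    for yield_rate in (0.9, 0.7, 0.5):
        good = candidate_dies * yield_rate
        print(f"${wafer_cost} wafer at {yield_rate:.0%} yield: "
              f"${wafer_cost / good:.0f} per good die")
```

Even the most pessimistic combination here tops out around $67 per die, nowhere near $129 for the silicon alone.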

The second bottom line is that there is no way the RSX (a mediocre chip in 2006) cost $129 while a 7970M (an excellent chip in 2013) costs less. If it did, everyone would be making powerful consoles, including Nintendo.