ethomaz said:

disolitude said:

This is a very informative post covering the cost of wafer use and chip manufacturing.

However, your pricing doesn't take into account the real cost of a GPU, CPU or any chip: its development. Skilled staff, research and development, RTL hardware guys, testing, drivers, SDKs/APIs, marketing, license fees, patent disputes, equipment, etc. The more complex the chip, the higher the R&D costs. A 40 nm chip is used for the AMD 6450 that retails for $39, while the 40 nm 6970 retailed for $399. According to your logic, both are 40 nm so they must cost the same, right? Even with different yields, where does the 10X markup come from? R&D for the 6970 was much higher than for the 6450, hence the higher cost to OEMs and consumers.

Your post seems to assume that just because AMD can push out GPUs at $24 per chip at a 70% yield rate, that is the cost for AMD to sell that GPU to Sony (plus $20 for manufacturing, apparently). The very first 7900M GPU cost AMD hundreds of millions of dollars to develop, and they have to recoup that cost somehow and hopefully make some money on it.

Now I know it's not $300 per unit for Sony to use this GPU, but my post said "cost of parts at retail"... That is the only information we can use unless someone has the inside scoop at Sony and AMD. Everything else at this point is pure speculation.

I think you are confusing the costs... we are talking about how much money Sony needs to spend to manufacture each PS4 unit... the R&D costs are covered over the lifetime of a console by the profit from hardware and software.

So if Sony can manufacture a PS4 for $300 and sell it at $350, then they will have a profit of $50... if Sony manufactures the PS4 for $400 and sells it at $350, then Sony is losing money on each PS4 sold... that's it.

The cost of manufacturing a console is the cost I tried to explain to you using the CPU/GPU/APU chip... of course there are other parts (other chips, HDD, BD drive, motherboard, etc.), but a GPU chip does not cost $200 to manufacture; it's easily one of the cheaper parts of a console... the PS3 cost $600 because the BD drive alone cost $200... the Cell was expensive compared with other CPU/GPU chips, but it is not what made the PS3 cost $600, because without the BD drive Sony could have sold the PS3 for $450 and lost little money on it.

That's the point... Sony needs to cover the entire manufacturing cost of the PS4 with its sales to retail, so that what happened with the PS3 (a loss on each console sold) doesn't happen again... the other costs like R&D, marketing, etc. can be covered by software sales.

That's it.


I fully understand your points but I disagree with this logic.

Let's take the PS3's launch for example. iSuppli's initial estimated cost of the PS3 was $840.

http://www.edge-online.com/news/isuppli-60gb-ps3-costs-840-produce/

"Some of the more expensive PS3 components that were charted by iSuppli include Nvidia’s $129 RSX graphics chip, the $125 blu-ray disc drive and the $89 IBM Cell processor."

So the RSX, which was based on older 90 nm tech and offered middle-of-the-road performance at the time, cost Sony $129. The RSX die is just under 200 mm², so let's say 300 chips per wafer, like you stated before.

Because this isn't a cutting-edge chip, let's say the yields were 90%: $5,000 / 270 good chips ≈ $19 per chip.
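
For what it's worth, here's a minimal sketch of that per-good-die arithmetic (the $5,000 wafer price and 300 dies per wafer are the assumed figures from the earlier post, not official numbers):

```python
# Rough per-die manufacturing cost: spread the wafer cost over the dies
# that actually pass testing. Both constants are assumptions from the thread.
WAFER_COST = 5000        # USD per wafer (assumed)
DIES_PER_WAFER = 300     # ~200 mm^2 die, per the earlier post (assumed)

def cost_per_good_die(yield_rate):
    """Wafer cost divided by the number of good dies at a given yield."""
    good_dies = DIES_PER_WAFER * yield_rate
    return WAFER_COST / good_dies

for y in (0.9, 0.7):
    print(f"{y:.0%} yield: ${cost_per_good_die(y):.2f} per good die")
# 90% yield: $18.52 per good die  (~$19, as above)
# 70% yield: $23.81 per good die  (~$24, the figure quoted earlier)
```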

So iSuppli states that the RSX cost Sony $129, yet each chip costs roughly $19 to manufacture.

How so?