Pemalite said:
fatslob-:O said:
Pemalite said:


No, it belongs to every fabrication process ever used and will ever be used.

There was a reason why Intel was slow to move Atom to a cutting-edge lithography, and why it used to release its motherboard chipsets a lithography or two behind its processors: costs and profit margins. It was simply cheaper to use the older lithography.

The reason Intel was slow to move Atom to cutting-edge lithography is that they didn't want to compromise on higher-margin parts, but if they had some extra fab capacity then they would've without doubt moved Atom onto more cutting-edge lithography instantly ...

The "older technology is cheaper" mentality doesn't apply to chip manufacturing, since newer process nodes have ALWAYS provided cost reductions aside from 28/22nm and below ...

It does apply, again, up to a point.
Maturity, Die Size, Fab Capacity all play a role, it's not as black and white as you think it is.
Not to mention additional R&D is required to shift anything to a new node, which also costs.

AMD, for instance, has been getting Caicos and Cedar built at 40nm for 4-5 years now, and nVidia is pushing out the Geforce 705 and 730 on 40nm, a design that stems back to the Geforce 400 series.
Atom and Intel Chipsets historically used older nodes for years.

They all have one thing in common: they are all small, cheap dies even on older process nodes.
Atom only recently started using the latest and greatest lithography due to Intel wanting to balloon its transistor counts to be competitive against ARM.

You also only have finite fab capacity, it doesn't make sense to cut into that capacity with low-margin parts, unless there is a damn good reason.

The funny thing about you guys' argument is that you are both right. But you are both also just not looking at it in a way that applies to consoles.

For consoles, going to a smaller fab process will ALWAYS be cheaper. This is simply because, unlike everything else, when a console die shrinks they don't cram in more transistors to make the chip more powerful. The chip remains identical, just smaller. That simply means they can make more of those chips per wafer. So where MS/Sony may have been spending $800M for 10M chips before, now they would be spending $800M for 16M or so chips.
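The wafer arithmetic above can be sketched with a few assumed numbers (die sizes, the ~0.625 area scaling, and ignoring edge loss and defects are all illustrative choices, not real foundry figures):

```python
# Illustrative sketch of the die-shrink arithmetic above.
# All numbers are assumed for illustration, not real foundry data.

def chips_per_wafer(wafer_area_mm2: float, die_area_mm2: float) -> int:
    """Gross dies per wafer, ignoring edge loss and defects."""
    return int(wafer_area_mm2 // die_area_mm2)

WAFER_AREA = 70_686      # ~300 mm wafer in mm^2 (pi * 150^2)

old_die = 350.0          # die area on the old node (assumed)
new_die = old_die * 0.625  # same chip, smaller node (assumed scaling)

old_chips = chips_per_wafer(WAFER_AREA, old_die)
new_chips = chips_per_wafer(WAFER_AREA, new_die)

# Same chip, smaller die: the same wafer spend buys ~1.6x more chips,
# which is the $800M-for-10M vs $800M-for-16M comparison in the post.
print(round(new_chips / old_chips, 1))
```

Since the chip itself doesn't change, the per-chip cost drops roughly in proportion to the die area, which is exactly why a shrink is always a win for a console part.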

The other thing to consider, and this is why console manufacturers may have to wait a little, is that when the fab process shrinks, for the first 2-3 months yields are lower. What that simply means is that at this point the chips cost more to make than their bigger counterparts, and only people willing to pay a "defect" premium will still order and build chips on the new node, usually the GPU makers. There is sort of a pecking order for who gets to use the new fab process first.
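The "defect premium" idea can be made concrete with assumed yields and wafer cost (the 60%/90% yields and $5,000 wafer price are illustrative, not real foundry data):

```python
# Sketch of the early-node "defect premium" described above.
# Yields and wafer cost are assumed numbers, not real foundry data.

def cost_per_good_die(wafer_cost: float, gross_dies: int, yield_rate: float) -> float:
    """Only working dies can be sold, so wafer cost is spread over them."""
    return wafer_cost / (gross_dies * yield_rate)

WAFER_COST = 5_000.0  # assumed cost per wafer, USD
GROSS_DIES = 300      # assumed dies per wafer on the new node

early = cost_per_good_die(WAFER_COST, GROSS_DIES, 0.60)   # immature yields
mature = cost_per_good_die(WAFER_COST, GROSS_DIES, 0.90)  # perfected process

# Early adopters pay 0.90 / 0.60 = 1.5x per good chip in this sketch;
# console makers wait until yields mature before the block order.
print(round(early / mature, 2))  # 1.5
```

This is why the pecking order exists: whoever jumps in first eats the low-yield cost, and the high-volume, low-margin console order only makes sense once that premium is gone.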

By the time the GPU guys, Apple and a couple of other smartphone makers get their own chips, another 8-9 months will have passed, and by that time the fab process is usually perfected and yields are at a maximum. Then the console guys can come in and make a block order for 16M chips.

Rinse and repeat.