Bofferbrauer2 said:

@bolded: We don't even know if that's a real chip (and at 20 CUs, I really doubt it, especially considering it would be totally bandwidth-starved even with DDR4-4000). But I digress.

But we know the Vega-based 2400G exists. We also know that AMD always makes a few APUs with every product series. And those things aren't designed to be graphical powerhouses anyway, so a RAM bottleneck is a moot point.

Bofferbrauer2 said: 

The actual cost depends on how much AMD has to pay per wafer, divided by how many chips on that wafer are salvageable for that purpose. So let's say a wafer costs $1000 (I'm just making up a price here), 20 such chips would fit on it, but only 10 would be fully functional; the others would have to be sold as 3400Gs or binned entirely due to defects. In that case AMD would certainly charge at least $100 for the 3600G just to cover the costs, and use the 3400G for profit.

However, on a console that's not possible, which is why the PS4 has 2 deactivated CUs to improve the yield rate.

Ok, while I know you are just making up examples, let's try to make it more accurate. The wafer size commonly used by AMD, and in turn Sony/MS, is a 300 mm diameter wafer. If each die is around 350 mm², you can fit around 200 dies per wafer. Each wafer costs anywhere between $300 and $10,000 on the foundry side of things, depending on the number of processing steps and the complexity.

AMD will pay the agreed amount for every wafer..... regardless of what is or isn't working on it, as long as an agreed-upon minimum chip yield per wafer is met. So say the cost of this wafer is $10,000 (and this is not how much a console APU wafer will cost). At this point AMD has spent $50/chip, assuming every single one of the chips works. Now they are about to sell them to Sony. They know Sony wants to be able to hit X and Y clock speeds, so 20 chips are off the table. Then they find out that of the 180 left, 20 are defective. They are now left with 160, so the cost of each chip to them is $62.50. Then they sell it to Sony/MS for $100+.
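If it helps, here's the same back-of-the-envelope math as a quick Python sketch. The $10,000 wafer, 200 dies per wafer, and the 20 + 20 rejected chips are just the illustrative numbers from above, not real foundry figures:

```python
# Back-of-the-envelope cost per usable chip, using the illustrative
# numbers from the post above (not real foundry pricing).

wafer_cost = 10_000        # $ paid to the foundry per wafer (made-up figure)
dies_per_wafer = 200       # roughly 350 mm^2 dies on a 300 mm wafer

cost_per_die = wafer_cost / dies_per_wafer
print(f"Cost per die if every die worked: ${cost_per_die:.2f}")    # $50.00

# Some dies can't hit the target clocks, some are outright defective.
too_slow = 20
defective = 20
usable = dies_per_wafer - too_slow - defective                      # 160 dies

cost_per_usable_die = wafer_cost / usable
print(f"Cost per usable die: ${cost_per_usable_die:.2f}")           # $62.50

# Selling each usable die at $100+ covers the whole wafer and leaves margin.
```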

As for yield rates, Sony/MS, after having agreed on their processor design, will know that that processor would cost them a fortune if they wanted everything to be perfect. Opting to deactivate CUs is part of the pricing process.
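To continue the same sketch: the per-bin counts below are purely invented to illustrate why budgeting for 2 deactivated CUs in the console spec lowers the cost per chip, they aren't real yield figures.

```python
# Why shipping the console chip with 2 CUs deactivated improves
# effective yield. Bin counts are invented for illustration only.

wafer_cost = 10_000
fully_working = 140        # all CUs functional at the target clocks
salvageable = 40           # one or two bad CUs; usable if 2 CUs are fused off
scrap = 20                 # can't be used at all

# If Sony/MS insisted on perfect dies:
print(f"Perfect dies only:  ${wafer_cost / fully_working:.2f} per chip")   # ~$71.43

# If the console spec already allows 2 deactivated CUs:
usable = fully_working + salvageable
print(f"2 CUs fused off OK: ${wafer_cost / usable:.2f} per chip")          # ~$55.56
```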

Bofferbrauer2 said: 

@underlined: Well, no, I'm not saying that they won't do it anymore, but rather that they are no longer obliged to do so just to have any sizable income at all.

At the time the PS4/XBO came out, AMD's CPUs were doing very badly and were the laughingstock of the industry. They had just released Hawaii, but had a lot of trouble keeping up with NVidia's updated Kepler (GeForce 700 series), so earnings were falling away left and right and they could only really compete on price. As a result their profit margin plummeted, and it is still awfully low for the sector (it's under 50% while Intel and NVidia are close to or above 70%; at the time it even dropped below 30%, which is bad in any sector). All this meant that AMD was desperate for some stable income, which left Sony and Microsoft holding all the cards during the price negotiations. But that won't be the case this time, and AMD will squeeze some profit out of the chips.

It's not an "obligation"..... it's just simple business practice. Your issue here is that you seem to think that the price console manufacturers pay for their chips is some sort of giveaway price...... it's not. It's the kind of price you get when you are dealing with a company that promises to buy upwards of 50M of something from day one. Console OEM pricing is not remotely indicative of what retail pricing will be, or of what the profit margins on direct sales or on sales to lower-tier OEMs (smaller volumes) will be. And that goes not just for the chips but for every single component that goes into the console. A good way to look at it is that whatever a component costs at retail, the console OEM will be paying less than half that amount.

And you have got this backwards: AMD needs their money even less now than it ever has. So it's more likely to work with them on what is more or less a licensed chip than to try to milk them for anything.

Bofferbrauer2 said: 

Also, as a side note, you give the cost at $30-40. Tell me how that works if about half of the sales were from console chips (which was true in 2016), yet the profit margin was only 24%? Do you think AMD sold all their other chips below production cost? And how could that be, considering most chips cost much more than the one in the PS4? Or do you think they had such a large R&D expense that it covered half the expenses before wages and taxes? I'm just saying that your price is off; it may well be below $100 by then, but I don't think it's anywhere close to the numbers you're putting there, more like $60-80. Don't forget that 350 mm² isn't exactly a small chip (a 10-core Skylake-X is only 322 mm², for instance) and that such a big chip normally sells at quite a bit higher prices for the reasons detailed above.

Again, you are going about this the wrong way...... yes, back in 2016, revenue from the semi-custom sector (which is probably 90% consoles) equated to about half of AMD's quarterly revenue in certain quarters. That's total revenue, not profit margin. There is a very big difference. E.g., in one particular quarter, revenue in that division was $590M (a real number, from their 2nd quarter of 2016). Now if in that quarter alone they took orders for, say, 5M chips and got around $100 for each one, what does that give you? Yup.... around $500M.
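To make the revenue-vs-profit distinction concrete, here's the same rough math in Python. The $590M divisional revenue is the Q2 2016 figure quoted above; the 5M units, ~$100 per chip, and the $62.50 unit cost carried over from the wafer sketch are all just the thread's illustrative numbers, so the margin it prints is a toy figure, not AMD's actual margin:

```python
# Revenue vs. profit, using the rough numbers from this thread.

units_shipped = 5_000_000       # illustrative quarterly order volume
price_per_chip = 100            # rough figure for what Sony/MS pay per chip
cost_per_chip = 62.50           # per-usable-die cost from the wafer sketch above

revenue = units_shipped * price_per_chip                    # ~$500M, near the $590M reported
gross_profit = units_shipped * (price_per_chip - cost_per_chip)

print(f"Revenue:      ${revenue / 1e6:.0f}M")               # 500M
print(f"Gross profit: ${gross_profit / 1e6:.0f}M")          # ~188M
print(f"Gross margin: {gross_profit / revenue:.0%}")        # ~38% in this toy example
```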

Still hard to understand?

Last edited by Intrinsic - on 18 March 2019