AMD was forced to buy a specific amount of wafers from GF per year; the reasons for that are irrelevant here. They had to pay even if they didn't need the additional wafers or couldn't sell any more chips made from them.
And yeah, they got bargain prices. AMD got around $100 for the OG PS4 chip (the article's writer must think AMD produces them out of thin air at no cost), and that's also more or less what the chip cost to produce. So no, AMD did not gain much from it, but it was good enough to stay afloat.
Besides, NVidia said themselves they weren't interested because the margins were way too small. Like the article details, NVidia made about $10 per PS4, which isn't very much and barely worth the work. The X1 was readily available and in surplus, so they had nothing to lose from the Nintendo deal, as they needed to do zero work for it. But they also stated they won't do custom chips anymore, which limits any upgrade to the Switch to an X2 unless they change their mind on it.
What are you saying bruh? Like do you understand how chip pricing works at all???
AMD getting $100 for each chip they give to Sony isn't them selling at bargain prices at all. That's them selling at "bulk/OEM" pricing, which is totally normal when any company puts in orders in the region of millions.
Take the 3600G for instance. Say AMD sells that at retail for $220; that pans out like this: the actual cost of making each of those chips (what AMD pays to the foundry) is like $30-40. Then AMD adds their markup to account for things like yields, profit, packaging, shipping, etc. At this point the chip comes up to around $170. Then they put on their MSRP sticker price of $220 so the retailers make their own cut too.
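To put that chain into numbers (the dollar figures here are the rough ones from this post, not anything official):

```python
# Illustrative only: the retail pricing chain sketched above.
# All figures are the ballpark numbers from the post, not real AMD data.

foundry_cost = 35   # what AMD pays the foundry per chip (~$30-40)
amd_price = 170     # AMD's price after yields, profit, packaging, shipping
msrp = 220          # sticker price; the gap above amd_price is the retailer's share

amd_gross = amd_price - foundry_cost   # AMD's gross take per retail chip
retailer_cut = msrp - amd_price        # what the retailer makes

print(f"AMD gross per chip: ${amd_gross}")   # $135
print(f"Retailer cut: ${retailer_cut}")      # $50
```

Compare that $135 gross (before yields, packaging, shipping, marketing) against the console case below, where most of those costs fall away.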
If that chip was going into a console, first off the console manufacturer will pay a sizeable sum to "customize" their chip. This reduces how much AMD spends on R&D for that chip, and nothing stops them from taking elements of that chip's design into their general product line. Then AMD doesn't have to worry about costs like packaging, shipping, and marketing, and there isn't a retailer cut either. AMD also isn't worrying about yields, as that's something Sony/MS absorbs.
So selling each chip for $100 will be them making a decent amount of money.
I don't even get how any of this is relevant..... are you saying that AMD is somehow not going to be selling chips at those prices anymore because they are doing well now? Well if that's what you are saying, you're just wrong. There's a reason why even Apple only puts AMD GPUs in their computers. And NVidia is just nonsense with regards to the kind of hardware that works for consoles. Not only are they resistant to dropping prices, they also just don't make APUs (that aren't ARM based). So Sony/MS using them would mean they must build a discrete CPU/GPU system.
@bolded: We don't even know if that's a real chip (and at 20 CUs, I really doubt it, especially considering it would be totally bandwidth starved even with DDR4-4000). But I digress.
The actual cost depends on how much AMD has to pay per wafer, divided by how many chips on that wafer are salvageable for that purpose. So let's say a wafer costs $1000 (I'm just making up a price here) and 20 such chips would fit on it, but only 10 would be fully functioning; the others would have to be sold as 3400Gs or binned entirely due to defects. In this case AMD would certainly charge at least $100 for the 3600G just to cover the costs, and use the 3400G for the profit.
However, on a console that's not possible, hence why the PS4 has 2 deactivated CUs to improve the yield rate.
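The yield math above boils down to this (again, made-up numbers, as stated):

```python
# Per-good-die cost: wafer price divided by the dies actually usable
# for the top part. All numbers are the hypothetical ones from the post.

wafer_cost = 1000      # made-up wafer price
dies_per_wafer = 20    # candidate chips that fit on one wafer
fully_working = 10     # dies good enough to ship as the full part (3600G)

# If the salvage dies (3400G) are treated as pure upside, the full part
# alone has to recover the whole wafer cost:
cost_per_good_die = wafer_cost / fully_working
print(cost_per_good_die)   # 100.0
```

A console has no salvage SKU to dump partial dies into, which is why shipping the PS4 chip with 2 CUs disabled (so more imperfect dies count as "good") does the same job.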
@italic: These costs are not always covered. I can remember that the cost of some chips was actually worked into the yearly contracts instead of AMD receiving a sum early on. And considering AMD doesn't seem to have gotten any lump sum (if they did, it doesn't show up in the financial reports at least), I do think they have to cover those expenses with the chip sales.
@underlined: Well, no, I'm not saying that they won't do it anymore, but rather that they are no longer obliged to do so just to have any sizable income at all.
At the time the PS4/XBO came out, AMD CPUs were doing very badly and were the laughingstock of the industry. They had just released Hawaii, but had many problems keeping up with NVidia's updated Kepler (GeForce 700 series), so sales were breaking away left and right and they could only really compete on price. As a result their profit margin plummeted, and it still is awfully low for the sector (it's under 50% while Intel and NVidia are close to or above 70%; at the time it even dropped below 30%, which is bad in any sector). All this left AMD desperate for some stable income, which meant Sony and Microsoft held all the cards during the price negotiations. But that won't be the case this time, and AMD will squeeze some profit out of the chips.
Also, as a side note, you give the cost as $30-40. Tell me how that works if about half of the sales came from console chips (which was true in 2016), yet the profit margin was only 24%? Do you think AMD sold all their other chips below production cost? And how could that be, considering most chips cost much more to make than the one in the PS4? Or do you think they had such R&D expenses that they ate half the revenue before wages and taxes? Just saying that your price is off. It may well have been below $100 by then, but I don't think anywhere close to the numbers you're putting there; more like $60-80. Don't forget that 350mm² ain't exactly a small chip (a 10-core Skylake-X is only 322mm², for instance) and that such a big chip normally sells at considerably higher prices for the reasons detailed above.
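A quick back-of-envelope blend shows why the $30-40 figure doesn't add up. This assumes console chips were about half of revenue and uses the 24% margin and $100 price from above; the two cost figures are just the hypotheses being compared:

```python
# Blended-margin sanity check: if half the revenue is console chips sold at
# $100 each, what margin would the OTHER half of the business need so that
# the overall margin lands at 24%? All inputs are this thread's numbers.

overall_margin = 0.24   # AMD's reported margin around 2016
console_share = 0.5     # console chips ~half of sales
chip_price = 100        # per-chip price AMD reportedly got

def implied_other_margin(chip_cost):
    """Margin the non-console half needs for the blend to hit 24%."""
    console_margin = (chip_price - chip_cost) / chip_price
    # overall = share * console + (1 - share) * other  ->  solve for "other"
    return (overall_margin - console_share * console_margin) / (1 - console_share)

print(implied_other_margin(35))   # -0.17: the rest sold below cost (implausible)
print(implied_other_margin(70))   # 0.18: low but believable
```

With a $30-40 production cost the rest of AMD's lineup would have had to sell at a loss, while a cost in the $60-80 range fits the reported margin.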
Your Apple example is a bit special: they use AMD due to OpenGL and OpenCL capabilities, where NVidia is weaker and generally has been. Them being cheaper than NVidia is only icing on the cake. But that's going to change soon anyway, considering that Apple wants to design all their chips in-house and is migrating everything to ARM.