Bofferbrauer2 said:
@bolded: We don't even know if that's a real chip (and at 20 CU, I really doubt it, especially since it would be totally bandwidth starved even with DDR4-4000). But I digress. The actual cost depends on how much AMD has to pay per wafer, divided by how many chips on that wafer are salvageable for the purpose. Say a wafer costs $1000 (I'm just making up a price here) and 20 such chips fit on it, but only 10 are fully functional; the rest would have to be sold as 3400Gs or binned entirely due to defects. In that case AMD would certainly charge at least $100 for the 3600G just to cover costs, and use the 3400G sales for profit. On a console, however, that's not possible, which is why the PS4 has 2 deactivated CUs to improve the yield rate.

@italic: These costs are not always covered up front; I can remember the cost of some chips actually being worked into the yearly contracts instead of AMD receiving a sum early on. And since AMD doesn't seem to have gotten any lump sum (if they did, it doesn't show up in the financial reports at least), I do think they have to cover those expenses with the chip sales.

@underlined: Well, no, I'm not saying they won't do it anymore, but rather that they are no longer obliged to do so just to have any sizable income at all. When the PS4/XBO came out, AMD's CPUs were doing very badly and were the laughingstock of the industry. They had just released Hawaii, but had a lot of trouble keeping up with Nvidia's updated Kepler (the GeForce 700 series), so earnings were breaking away left and right and they could only really compete on price. As a result their profit margin plummeted, and it is still awfully low for the sector (under 50%, while Intel and Nvidia are close to or above 70%; at the time it even dropped below 30%, which is bad in any sector). All of this left AMD desperate for some stable income, so Sony and Microsoft held all the cards during the price negotiations. That won't be the case this time, and AMD will squeeze some profit out of the chips.

Also, as a side note, you put the cost at $30-40. Tell me how that works if about half of the sales came from console chips (which was true in 2016) yet the profit margin was only 24%? Do you think AMD sold all their other chips below production cost? How could that be, when most of those chips cost much more than the one in the PS4? Or do you think R&D alone covered half the expenses before wages and taxes? I'm just saying your price is off; it may well have been below $100 by then, but nowhere near the numbers you're giving, more like $60-80. Don't forget that 350mm² isn't exactly a small chip (a 10-core Skylake-X is only 322mm², for instance), and a chip that big normally sells at considerably higher prices for the reasons detailed above.

Your Apple example is a bit special: they use AMD because of its OpenGL and OpenCL capabilities, where Nvidia is weaker and generally has been. AMD being cheaper than Nvidia is only icing on the cake. That's going to change soon anyway, considering Apple wants to design all their chips in-house and is migrating everything to ARM.
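To make the per-chip cost arithmetic in that quote concrete, here is a minimal sketch using the quote's own made-up numbers (a $1000 wafer, 20 candidate dies, half of them fully functional); none of these are real figures.

```python
# Sketch of the per-chip cost arithmetic from the quote above.
# All figures are the quote's made-up numbers, not real AMD data.

wafer_cost_usd = 1000   # hypothetical wafer price
dies_per_wafer = 20     # chips that physically fit on the wafer
good_dies = 10          # dies fully functional for the top bin (the "3600G")

# If only the fully functional dies have to recover the wafer cost
# (defective dies are salvaged as a lower bin or scrapped), each good die
# carries its share of the whole wafer:
break_even_per_good_die = wafer_cost_usd / good_dies
print(f"Break-even price per fully functional die: ${break_even_per_good_die:.2f}")  # $100.00

# A console APU has only one bin, so the effective cost per usable chip is set
# directly by the yield rate. Disabling 2 CUs (as on the PS4) lets slightly
# defective dies still count as usable, which raises that yield:
for yield_rate in (0.50, 0.80):
    usable_dies = dies_per_wafer * yield_rate
    print(f"Yield {yield_rate:.0%}: ${wafer_cost_usd / usable_dies:.2f} per usable console chip")
```

Raising the usable-die count from 10 to 16 per wafer drops the break-even price from $100 to $62.50, which is why a single-bin console chip benefits so much from a couple of spare CUs.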
https://www.macrotrends.net/stocks/charts/AMD/amd/profit-margins
https://ycharts.com/companies/NVDA/profit_margin
https://ycharts.com/companies/INTC/gross_profit_margin
A 70% profit margin is not common at all.
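Since the whole cost disagreement comes down to how per-chip production cost, selling price, and gross margin relate, here is a minimal sketch of that arithmetic. The $100 selling price per console APU and the 30% margin for the rest of the business are assumptions made up for illustration, not reported figures.

```python
# Gross margin = (revenue - cost of goods sold) / revenue.
# All prices and the non-console margin below are hypothetical.

def gross_margin(price: float, cost: float) -> float:
    """Fraction of revenue left after the cost of producing the chip."""
    return (price - cost) / price

console_price = 100.0  # assumed selling price per console APU (hypothetical)

# Margin on the console segment at different per-chip production costs:
for cost in (30.0, 40.0, 60.0, 80.0):
    print(f"Chip cost ${cost:.0f}: console segment margin {gross_margin(console_price, cost):.0%}")

# If console chips are roughly half of revenue, the company-wide margin is
# approximately the revenue-weighted average of the two halves:
other_segment_margin = 0.30  # hypothetical margin on the non-console half
for cost in (35.0, 70.0):
    blended = 0.5 * gross_margin(console_price, cost) + 0.5 * other_segment_margin
    print(f"Chip cost ${cost:.0f}: blended company margin about {blended:.0%}")
```

With those assumed numbers, a $30-40 production cost would push the blended margin well above a reported ~24%, while a $60-80 cost lands much closer to it; that is the shape of the argument made in the quote.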
Mummelmann said:
Perhaps, but developers do, and aiming for higher fps means better margins. As it stands, a lot of games that aim for 30 fps dip into slideshow territory; games like AC: Odyssey are a good example, and I don't think I've ever seen worse frame rates in a modern AAA title. With more effects and higher resolutions, severe drops in frame rate become even more jarring; a more advanced and crisp render stands out all the more when it slows down. The Xbox One X doesn't provide proper 4K, it uses checkerboard rendering and rather few effects, frame rates in many titles are very low, and Destiny 2 runs at a somewhat unstable 30 fps.

Half-assed resolutions with poor performance are not what developers want to work with; it's better to find an entry point where they can get both visual fidelity and performance and not be forced to choose. And if the hardware ends up too costly, developers and publishers have a smaller market to sell their software to. I'd much rather the standard be a stable 1440p with full effects and shading than stripped-down or faked 4K, and then perhaps release a more expensive version of the console that does more or less what the Pro and One X do for the line-up right now. The fact that 30 fps is still more or less the industry standard in console gaming in 2019 is downright shameful, especially since games often dip well below it.
Which console developers are aiming for 60 fps outside of FPS, racing, fighting, and competitive multiplayer games?
Most console games run at 30 fps most of the time, and that doesn't seem likely to change.
There is nothing shameful about a 30 fps standard. Most console gamers have accepted, expected, or even preferred 30 fps with higher image quality over 60 fps that would require cutting everything else roughly in half.
duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"
http://gamrconnect.vgchartz.com/post.php?id=8808363
Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"
http://gamrconnect.vgchartz.com/post.php?id=9008994
Azzanation: "PS5 wouldn't sold out at launch without scalpers."