Bofferbrauer2 said:
Intrinsic said:

What are you saying bruh? Like do you understand how chip pricing works at all??? 

AMD getting $100 for each chip they give to Sony isn't them selling at bargain prices at all. That's them selling at "bulk/OEM" pricing, which is totally normal when any company puts in orders in the region of millions of units.

Take the 3600G for instance. Say AMD sells that at retail for $220; that pans out like this: the actual cost of making each of those chips (what AMD pays the foundry) is something like $30-$40. Then AMD adds their markup to account for things like yields, profit, packaging, shipping, etc. At this point the chip comes up to around $170. Then they put their MSRP sticker price of $220 on it so the retailers make their own cut too.

If that chip was going into a console, first off the console manufacturer will pay a sizeable sum to "customize" their chip. This reduces how much AMD spends on R&D for that chip, and nothing stops them from taking elements of that chip's design into their general product line. Then AMD isn't worrying about costs like packaging, shipping and marketing, and there isn't a retailer cut either. AMD also isn't worrying about yields, as that's something Sony/MS absorbs.

So selling each chip for $100 still means they're making a good amount of money.

I don't even get how any of this is relevant..... are you saying that AMD is somehow not going to sell chips at those prices anymore because they are doing well now? If that is what you are saying, then you are just wrong. There is a reason why even Apple only puts AMD GPUs in their computers. And Nvidia is just a non-starter for the kind of hardware that works in consoles: not only are they resistant to dropping prices, they also just don't make APUs (that aren't ARM based). So Sony/MS using them would mean they must build a discrete CPU/GPU system.

@bolded: We don't even know if that's a real chip (and at 20 CUs, I really doubt it, especially considering it would be totally bandwidth-starved even with DDR4-4000). But I digress.

The actual cost depends on how much AMD has to pay per wafer, divided by how many chips on that wafer are salvageable for that purpose. So let's say a wafer costs $1000 (I'm just making up a price here) and 20 such chips would fit on it, but only 10 are fully functional; the others would either have to be sold as 3400Gs or binned entirely due to defects. In that case AMD would certainly charge at least $100 for the 3600G just to cover the costs, and use the 3400G sales for the profit.

However, on a console that kind of binning isn't possible, which is why the PS4's chip has 2 deactivated CUs to improve the yield rate.
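To put numbers on that, here's a quick sketch of the cost-per-good-die math using the made-up figures from my example ($1000 wafer, 20 candidate dies, half of them fully functional); the function and the salvage split are purely illustrative:

from math import inf

def cost_per_good_die(wafer_cost, dies_per_wafer, full_yield, salvage_yield=0.0):
    # Spread the wafer cost over every die that can actually be sold,
    # whether as the full part or as a salvaged lower bin (e.g. 3400G).
    sellable = dies_per_wafer * (full_yield + salvage_yield)
    return wafer_cost / sellable if sellable else inf

# Only the 10 perfect dies carry the whole $1000 wafer: $100 each.
print(cost_per_good_die(1000, 20, full_yield=0.5))                      # 100.0

# If 5 more dies can be salvaged as a cut-down SKU, the cost per sellable die drops.
print(cost_per_good_die(1000, 20, full_yield=0.5, salvage_yield=0.25))  # ~66.7

# A console APU can't be binned into several retail SKUs, so disabling a couple of
# CUs up front is what raises full_yield instead.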

@italic: Those costs are not always covered up front; I remember that for some chips the customization cost was actually worked into the yearly contracts instead of being paid as a sum early on. And considering AMD doesn't seem to have gotten any lump sum (if they did, it doesn't show up in the financial reports, at least), I do think they have to cover those expenses with the chip sales.

@underlined: Well, no, I'm not saying that they won't do it anymore, but rather that they are not obliged to do so anymore to have any sizable income at all.

At the time the PS4/XBO came out, AMD CPUs were doing very badly and were the laughingstock of the industry. They had just released Hawaii, but had a lot of trouble keeping up with Nvidia's updated Kepler (the GeForce 700 series), so earnings were falling away left and right and they could only really compete on price. As a result their profit margin plummeted, and it is still awfully low for the sector (it's under 50% while Intel and Nvidia are close to or above 70%; at the time it even dropped below 30%, which is bad in any sector). All this meant AMD was desperate for some stable income, which left Sony and Microsoft holding all the cards during the price negotiations. But that won't be the case this time, and AMD will squeeze some profit out of the chips.

Also, as a side note, you put the cost at $30-40. Tell me how that works if about half of AMD's sales came from console chips (which was true in 2016) yet the profit margin was only 24%? Do you think AMD sold all their other chips below production price? And how could that be, considering most of their chips cost much more than the one in the PS4? Or do you think their R&D expenses alone covered half the costs before wages and taxes? I'm just saying your figure is off; it may well be below $100 by now, but nowhere near the numbers you're putting there, more like $60-80. Don't forget that 350mm² isn't exactly a small chip (a 10-core Skylake-X is only 322mm², for instance), and a chip that big normally sells at considerably higher prices, for the reasons detailed above.
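For reference, here's a rough back-of-the-envelope sketch of how many ~350mm² dies fit on a 300mm wafer and how the per-die cost moves with wafer price and yield; the wafer prices and yield rates below are placeholders, not actual figures:

from math import pi, sqrt

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Common approximation for gross dies on a round wafer, with an edge-loss term.
    r = wafer_diameter_mm / 2
    return int(pi * r * r / die_area_mm2 - pi * wafer_diameter_mm / sqrt(2 * die_area_mm2))

gross = dies_per_wafer(300, 350)   # roughly 160-170 candidate dies

# Per-die cost swings a lot with both wafer price and yield, which is exactly why
# $30-40 vs. $60-80 is worth arguing about.
for wafer_price in (4000, 8000):       # placeholder wafer prices
    for yield_rate in (0.9, 0.6):      # placeholder yield rates
        print(wafer_price, yield_rate, round(wafer_price / (gross * yield_rate), 1))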

Your Apple example is a bit special: they use AMD because of OpenGL and OpenCL capabilities, where Nvidia is weaker than AMD and generally has been. Them being cheaper than Nvidia is only icing on the cake. But that's going to change soon anyway, considering that Apple wants to design all their chips in-house and is migrating everything to ARM.

https://www.macrotrends.net/stocks/charts/AMD/amd/profit-margins

https://ycharts.com/companies/NVDA/profit_margin

https://ycharts.com/companies/INTC/gross_profit_margin

A 70% profit margin isn't something common at all.

 

Mummelmann said:
DonFerrari said:

except 60 fps is hardly something console gaming requires or cares about for most genres.

Perhaps, but developers do, and aiming for higher fps gives better performance margins. As it stands, a lot of games that aim for 30 fps dip into slideshow territory; games like AC: Odyssey are a good example, I don't think I've ever seen worse frame rates in a modern AAA title. With more effects and higher resolutions, severe drops in frame rate become even more jarring; a more advanced and crisp image stands out all the more when it slows down.

The Xbox One X doesn't provide proper 4K; it uses checkerboard rendering and rather few effects, frame rates on many titles are very low, and Destiny 2 runs at a somewhat unstable 30 fps.

Half-assed resolutions with poor performance are not what developers want to work with; it's better to find an entry point where they can get both visual fidelity and performance and not be forced to choose. And if the hardware ends up too costly, developers and publishers have a smaller market to sell their software to. I'd much rather the standard be a stable 1440p with full effects and shading than stripped-down or faked 4K, and then perhaps release a more expensive version of the console that does more or less what the Pro and One X do for the line-up right now.

The fact that 30 fps is still more or less industry standard in console gaming in 2019 is downright shameful, especially since games often dip well below.

Which console developers are aiming for 60 fps outside of FPS, racing, fighting and competitive multiplayer games?

Most console games run at 30 fps most of the time, and that doesn't seem likely to change.

There is nothing shameful about a 30 fps standard. Most console gamers have accepted/expected/preferred 30 fps with higher image quality over 60 fps that would mean cutting everything else roughly in half.


