OdinHades said:
Screenshot said:

16 gig is more than enough for 4k even on pc. 

16 GB system memory, yes. But if we're talking about shared memory that would be something like 8 GB system memory + 8 GB VRAM. That wouldn't be enough for gaming in 4K for the next 6 years or so.

Errrrr..... no. Just no. What on earth are you doing with 8 GB of RAM for the CPU? CPU code just doesn't require that much RAM. The reason the GPU ends up taking as much RAM as it does is textures and stuff like that. Basically images.

Bofferbrauer2 said:

AMD was forced to buy a specific amount of wafers from GF per year; the reasons for that are irrelevant. They had to pay even if they didn't need, or couldn't sell, the chips from any additional wafers.

And yeah, they got bargain prices. AMD got around $100 for the OG PS4 chip (the article writer must think AMD produces them out of thin air at no cost), and that's also more or less what the chip cost to produce. So no, AMD did not gain much from it, but it was good enough to stay afloat.

Besides, Nvidia said themselves they weren't interested because the margins were way too small. Like the article details, Nvidia made about $10 per PS4, which isn't very much and barely worth the work. The X1 was readily available and in surplus, so they had nothing to lose from the Nintendo deal as they needed to do zero work for it. But they stated themselves they won't do custom chips anymore, which also limits any upgrade to the Switch to an X2 unless they change their mind on it.

What are you saying bruh? Like do you understand how chip pricing works at all??? 

AMD getting $100 for each chip they give to Sony isn't them selling at bargain prices at all. That's them selling at "bulk/OEM" pricing, which is totally normal when any company puts in orders in the region of millions.

Take the 3600G for instance. Say AMD sells that at retail for $220; it pans out like this... the actual cost of making each of those chips (what AMD pays to the foundry) is like $30-$40. Then AMD adds their markup to account for things like yields, profits, packaging, shipping, etc. At that point the chip comes up to around $170. Then they put their MSRP sticker price of $220 on it so the retailers make their own cut too.

If that chip was going into a console, first off the console manufacturer will pay a sizeable sum to "customize" their chip. This reduces how much AMD spends on R&D for that chip, and nothing stops them from taking elements of that chip's design into their general product line. Then AMD isn't worrying about costs like packaging, shipping, and marketing, and there isn't a retailer cut either. AMD also isn't worrying about yields, as that's something Sony/MS absorbs.

So selling each chip for $100 will still be them making a decent amount of money.
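To put very rough numbers on it, here's a back-of-the-envelope sketch. Everything in it is just the illustrative figures from above (foundry cost, ~$170 to the channel, $220 MSRP, $100 semi-custom price) plus an assumed ~$20 per-chip retail overhead, nothing official:

```python
# Back-of-the-envelope margin comparison, using the illustrative numbers
# from the post above. None of these are official BOM figures.

FOUNDRY_COST = 35        # ~$30-$40 AMD pays the foundry per chip
RETAIL_TO_CHANNEL = 170  # roughly what AMD gets before the retailer's cut
RETAIL_MSRP = 220        # sticker price; the retailer keeps the difference
CONSOLE_PRICE = 100      # what Sony/MS pays AMD per semi-custom chip

# Retail: AMD also carries packaging, shipping, marketing and yield risk.
# Assume a ballpark $20 per chip for that (purely illustrative).
retail_overhead = 20
retail_margin = RETAIL_TO_CHANNEL - FOUNDRY_COST - retail_overhead

# Console: customization/R&D is paid up front by the console maker, and
# packaging, marketing, the retail cut and yield risk are largely absorbed
# by Sony/MS, so the per-chip overhead is close to zero.
console_margin = CONSOLE_PRICE - FOUNDRY_COST

print(f"Retail margin per chip:  ~${retail_margin}")   # ~$115
print(f"Console margin per chip: ~${console_margin}")  # ~$65
```

Even at roughly half the per-chip margin, guaranteed multi-million-unit volume with no channel costs or yield risk attached is still good money, which is the whole point.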

I don't even get how any of this is relevant..... are you saying that AMD is somehow not going to sell chips at those prices anymore because they are doing well now? Well if that is what you are saying then you are just wrong. There is a reason why even Apple only puts AMD GPUs in their computers. And Nvidia is just a non-starter with regards to the kind of hardware that works for consoles. Not only are they reluctant to drop prices, they also just don't make APUs (that aren't ARM based). So Sony/MS using them would mean they "must" build a discrete CPU/GPU system.