| drkohler said: Nonsense, and a myth that will be perpetuated ad infinitum, it seems. |
| First of all, mixed memory would have required an extremely awkward memory interface design in the APU, because an additional DDR3 interface would have messed up the chip layout and increased the die size considerably. You can't just drop something like that in at the last minute and think "it will work anyhow". |
| Next, Sony knew very well that 4Gbit chips would be available by the time the PS4 went into actual mass manufacturing. Very likely not when they started the design cycle of the PS4 (although the writing was probably on the wall even back then, as it was all too logical that Samsung would be manufacturing bigger chips one day), but very likely at least a year before manufacturing of the PS4 started in earnest. The cool thing is that they actually succeeded in convincing everyone (or at least the key competitor) that the PS4 would only have 4GB, right up to the point when they actually ordered the larger chips. Which, in this special case, might have been less than a year before needing them (significantly increasing the memory price for the initial production runs). As the new chips were drop-ins for the older chips, no redesign of the APU whatsoever was necessary, which eased the decision to accept the higher price. |
| At least one year before production starts, any console manufacturer has ordered all of its chips, probably with year-long contracts (afaik, the memory for the X360 was ordered three years ahead of manufacturing). The rule is very simple: the earlier and the more you order, the cheaper it gets. And that amount/time/price curve is extremely brutal. It is completely inconceivable that Sony ordered 300M+ 2Gbit chips and then, at basically the last second, told Samsung "Eh, we'll take the 4Gbit chips instead"; they would have been sitting on those 300M+ (already paid for) 2Gbit chips, because nobody would have taken them back. |
You do know that the OG PS4 and PS4 Pro both use and support mixed RAM, right?
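
That said, the "drop-in" part of the quoted post is easy to sanity-check with some napkin math. A minimal sketch, assuming the commonly cited PS4 layout of a 256-bit GDDR5 bus fed by sixteen 32-bit chips (those figures are my assumption, not stated in the post above): swapping 2 Gbit parts for 4 Gbit parts doubles capacity from 4GB to 8GB without changing the chip count or the memory interface on the APU.

```python
# Back-of-the-envelope check of the "drop-in" claim above.
# Assumed figures (mine, not from the post): a 256-bit GDDR5 bus
# populated by 32-bit-wide chips, i.e. 16 chips in total.
BUS_WIDTH_BITS = 256
BITS_PER_CHIP = 32
CHIP_COUNT = BUS_WIDTH_BITS // BITS_PER_CHIP  # 16 chips in either configuration

def total_capacity_gb(chip_density_gbit: int) -> float:
    """Total memory for a fully populated bus, in GB (8 Gbit = 1 GB)."""
    return CHIP_COUNT * chip_density_gbit / 8

print(total_capacity_gb(2))  # 4.0 GB with 2 Gbit chips
print(total_capacity_gb(4))  # 8.0 GB with 4 Gbit chips, same chip count, same bus
```

Same chip count and the same controller on the APU; only the density per package changes, which is presumably what made a late capacity bump a procurement decision rather than a silicon one.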







