mrstickball said:
I think it's viable. The Xbox went from 64MB to 512MB in just 4 years (an 8-fold increase). We're at year 5.5 and climbing for that system. For the PS2/3, it went from 32MB to 512MB in about 6.5 years (a 16-fold increase). We have to take the following into consideration:
Given the time frames, and assuming a late 2013 launch for the MS/Sony devices, we're looking at about a seven-year gap between hardware, which is unheard of for consoles. The only comparison we can make is with handheld devices, such as the DS-to-3DS gap of 6.5 years (Nov 2004 to Mar 2011). In that case, RAM expanded from 4MB to 128MB, a 32-fold increase. Of course, the price did increase too, which is something to take into consideration. Given the comparables, then, we *should* be looking at a 16- to 24-fold increase in RAM, even accounting for cost-cutting as Sony and MS likely try to shed their loss-leading ways. That would put RAM between 8GB and 12GB over a 6.5-to-7-year gap from their earlier systems.
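To make the arithmetic above concrete, here's a quick back-of-the-envelope sketch in Python. The RAM figures and launch gaps are the ones cited in the post; the 16x-24x projection is the poster's extrapolation, not any official roadmap.

```python
# Fold-increase check using the figures cited in the post.

def fold_increase(start_mb, end_mb):
    """How many times RAM grew between two systems."""
    return end_mb / start_mb

# Historical data points from the post
xbox_to_360 = fold_increase(64, 512)   # ~4 years
ps2_to_ps3  = fold_increase(32, 512)   # ~6.5 years
ds_to_3ds   = fold_increase(4, 128)    # ~6.5 years

print(xbox_to_360, ps2_to_ps3, ds_to_3ds)  # 8.0 16.0 32.0

# Projection: 16x to 24x on top of the 360/PS3's 512 MB, in GB
low_gb  = 512 * 16 / 1024
high_gb = 512 * 24 / 1024
print(low_gb, high_gb)  # 8.0 12.0
```

So the 8GB-12GB range falls straight out of applying the historical 16x-24x multipliers to 512MB.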
Actually, the original Xbox used 128 Mbit chips, whereas the Xbox 360 uses 512 Mbit chips. The increase in density over four years is only fourfold, which makes sense, as transistor density has roughly doubled every two years or thereabouts. Microsoft would have to keep using a more expensive eight-chip memory configuration, and with 8 Gbit chips at that. You'd have to hope that 8 Gbit chips are even economical to produce by the time the next generation arrives, especially high-speed ones, which tend to be considerably more expensive. You'd also have to hope that Microsoft/Sony would be willing to use eight of them... With each process node becoming more expensive, it may no longer be feasible to expect RAM to be produced on a cutting-edge node.
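The density argument can be sketched the same way. This assumes the usual rule-of-thumb doubling cadence of roughly two years; the chip sizes are the ones cited above.

```python
# Rough sketch of the chip-density argument: if density doubles every
# ~2 years, how big does a DRAM chip get, and how many chips does a
# given total still require? The doubling period is a rule of thumb.

def density_after(start_mbit, years, doubling_period=2.0):
    """Projected per-chip density after `years` of doubling."""
    return start_mbit * 2 ** (years / doubling_period)

# Xbox-era 128 Mbit chips, four years later
print(density_after(128, 4))  # 512.0 -> the fourfold jump to the 360

# Even with 8 Gbit (1 GB) chips, 8 GB total still needs 8 chips
mb_per_chip = 8192 // 8       # 8 Gbit = 1024 MB
chips_needed = (8 * 1024) // mb_per_chip
print(chips_needed)  # 8
```

Which is the crux of the reply: density alone only buys 4x per four-year window, so hitting 8GB still means paying for an eight-chip configuration.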
Tease.