HoloDust said: GT post show: Pete Hines (Bethesda) - "Not as big a leap as PS2 to PS3." Mark Rein (Epic) - "I disagree, this is spectacular."
The #1 wish from Crytek was RAM. Bethesda makes fun games, but they've never made a great-looking game or an engine that pushes the industry; Epic and Crytek specialize in pushing the most advanced game engines. Crysis 3 scales across 6+ CPU cores, and that's just the beginning for next-gen games. Skyrim can't even use more than 2 cores.
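To make the core-scaling point concrete, here's a minimal C++ sketch (the entity-update workload and all the names in it are made up for illustration, not taken from any real engine): an engine built like Crysis 3 fans its per-frame work out to every core the OS reports, while a 2-thread design leaves most of an 8-core chip idle.

```cpp
// Sketch: split one frame's workload across all available cores.
// update_entities and the entity count are hypothetical.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <thread>
#include <vector>

// Pretend per-frame job: update a slice of game entities.
void update_entities(std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i) {
        // ... physics/AI/animation work for entity i ...
    }
}

int main() {
    const std::size_t num_entities = 100000;
    // Ask the OS how many hardware threads exist and use them all;
    // an engine hardcoded to 2 threads would leave the other 6 cores
    // of an 8-core Jaguar idle.
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> workers;
    const std::size_t chunk = num_entities / cores;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end = (c + 1 == cores) ? num_entities : begin + chunk;
        workers.emplace_back(update_entities, begin, end);
    }
    for (auto& t : workers) t.join();
    std::cout << "Updated " << num_entities << " entities on "
              << cores << " threads\n";
}
```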
I tend to trust Epic's opinion on this one, since UE4 is very impressive in terms of pushing graphics/game engines. If you read all the interviews, Crytek said RAM was very important since you can do a lot of tricks with it. Sony didn't just surprise us with 8GB of RAM, which wouldn't have been a big deal if it were $50 worth of DDR3; they managed to include 8GB of GDDR5!

Just to put this in perspective: the cheapest AMD HD 7970 card with 6GB of GDDR5 is $600, and the GeForce Titan with 6GB is $1,000. Obviously the actual GPUs on those cards are much more powerful overall, but 8GB of GDDR5 is very expensive. Nvidia charges $50 just to go from 2GB of GDDR5 to 4GB on the GTX 680.

Sony went all out here: they retained ~170GB/sec of bandwidth for the GPU, whereas both the PS3's GPU and the Xbox 360's GPU had their memory bandwidth cut roughly in half compared to the PC parts they were derived from. In this case, the 18 CU GCN GPU gets the full bandwidth of its PC equivalent ;) On the graphics side, anything faster would have meant a GTX 675MX/680M/680MX or an HD 7970M, and those are crazy expensive.
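As a quick sanity check on that bandwidth figure (assuming the widely reported 256-bit GDDR5 interface at 5.5Gbps per pin; those two numbers are my assumption, not from the announcement quotes above):

```latex
% Peak bandwidth = bus width x per-pin data rate, converted to bytes
\[
\frac{256\,\text{bits} \times 5.5\,\text{Gb/s}}{8\,\text{bits/byte}} = 176\,\text{GB/s}
\]
```

which lands right around the ~170GB/sec figure being quoted.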
The CPU is probably going to be disappointing in the long term if it's an 8-core Jaguar, compared to Intel Haswell chips or AMD's 4-module/8-core Piledriver parts.
I don't know how MS is going to respond to this. I think they just got caught with their pants down, probably expecting 4GB of GDDR5 at most and thinking they would trump Sony by marketing that their system uses 8GB of system memory. It's probably unrealistic to redesign the Xbox 720's GPU/memory subsystem at this point. Perhaps MS will use the highest Jaguar stepping, 1.8-1.85GHz instead of the rumored 1.6GHz.
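For scale, here's a quick check on how much that stepping bump would actually buy, using only the rumored clocks above:

```latex
% Relative per-core throughput from a clock bump alone
\[
\frac{1.8\,\text{GHz}}{1.6\,\text{GHz}} = 1.125, \qquad
\frac{1.85\,\text{GHz}}{1.6\,\text{GHz}} \approx 1.156
\]
```

So roughly 12-16% more per-core throughput at best; a real lever, but a modest one.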