Wow, found an excellent post by Timothy Lottes, the chap who invented FXAA anti-aliasing at Nvidia, about the problems with using slower DDR3 for the GPU in a console and trying to make up for it with eDRAM, instead of going with a full-bus-width GDDR5 setup as used in traditional GPUs (and rumoured for PS4). I said earlier that you cannot overcome memory bandwidth limitations to the GPU by going the DDR3 + eDRAM route, because if you could, GPU makers would already be doing it. Lottes explains it on a technical level.
Lottes is wary of Microsoft relying on a large pool of DDR3 memory because of the limits it places on memory bandwidth. On this issue he says:
“On this platform I’d be concerned with memory bandwidth. Only DDR3 for system/GPU memory paired with 32MB of “ESRAM” sounds troubling. 32MB of ESRAM is only really enough to do forward shading with MSAA using only 32-bits/pixel color with 2xMSAA at 1080p or 4xMSAA at 720p. Anything else to ESRAM would require tiling and resolves like on the Xbox360 (which would likely be a DMA copy on 720) or attempting to use the slow DDR3 as a render target.
I’d bet most titles attempting deferred shading will be stuck at 720p with only poor post process AA (like FXAA). If this GPU is pre-GCN with a serious performance gap to PS4, then this next Xbox will act like a boat anchor, dragging down the min-spec target for cross-platform next-generation games.”
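Lottes’s ESRAM sizing claim can be sanity-checked with quick arithmetic. The sketch below is my own back-of-envelope calculation (not from his post), assuming a forward renderer needs a 32-bit color buffer plus a 32-bit depth buffer, each multiplied by the MSAA sample count:

```python
# Back-of-envelope check of the 32MB ESRAM sizing claim.
# Assumption: forward shading needs color + depth, both multisampled.

MB = 1024 * 1024

def render_target_mb(width, height, msaa, color_bytes=4, depth_bytes=4):
    """Size in MB of a multisampled color + depth render target."""
    samples = width * height * msaa
    return samples * (color_bytes + depth_bytes) / MB

print(round(render_target_mb(1920, 1080, 2), 1))  # 1080p, 2xMSAA -> 31.6 MB
print(round(render_target_mb(1280, 720, 4), 1))   # 720p, 4xMSAA  -> 28.1 MB
```

Both configurations land just under 32MB, which is presumably why those are exactly the two modes Lottes names; anything fatter (a deferred G-buffer with multiple render targets, for instance) blows the budget and forces tiling or spilling to DDR3.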
Lottes seems less concerned about the rumored specs of the next Playstation. Most rumors have been pointing at the PS4 having less RAM than the next Xbox (4GB vs 8GB), but that may not matter if Sony uses GDDR5 memory instead of the DDR3 Microsoft is rumored to be considering: GDDR5 delivers far higher memory bandwidth, which matters more to the GPU than raw capacity.
Lottes says:
“If PS4 has a real-time OS, with a libGCM style low level access to the GPU, then the PS4 1st party games will be years ahead of the PC simply because it opens up what is possible on the GPU. Note this won’t happen right away on launch, but once developers tool up for the platform, this will be the case.
As a PC guy who knows hardware to the metal, I spend most of my days in frustration knowing damn well what I could do with the hardware, but what I cannot do because Microsoft and IHVs won’t provide low-level GPU access in PC APIs. One simple example, drawcalls on PC have easily 10x to 100x the overhead of a console with a libGCM style API.”
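To get a feel for what 10x–100x per-call overhead means in practice, here is a hypothetical worked example. The per-call costs are illustrative round numbers I chose, not measurements from Lottes or anyone else; the point is only how the draw-call budget scales:

```python
# Illustrative arithmetic for the 10x-100x draw-call overhead claim.
# The microsecond costs below are hypothetical, chosen only to show scaling.

FRAME_BUDGET_US = 1000 / 30 * 1000  # ~33,333 us per frame at 30 fps

def max_draw_calls(per_call_overhead_us):
    """Draw calls per frame if API overhead alone consumed the whole budget."""
    return int(FRAME_BUDGET_US / per_call_overhead_us)

print(max_draw_calls(3))    # ~3 us/call (low-level console-style API) -> 11111
print(max_draw_calls(300))  # 100x that overhead on a high-level API   -> 111
```

Under these made-up numbers, a 100x overhead difference is the gap between budgeting tens of thousands of draw calls per frame and budgeting barely a hundred, which is why engines on high-overhead APIs lean so heavily on batching and instancing.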
Looks like Sony is doing all the right things to fix the major issues of the PS3, because I've already linked other sources saying the PS4 will allow libGCM-style access to the metal of its hardware!