sergiodaly said:
question @ALL
if the GPUs in the PS4 and nextbox are both from AMD... what makes people believe the Xbox will have eDRAM or eSRAM and the PS4 will not? If the Wii U uses it, it's obvious AMD thinks it's a good idea for the GPU, and I believe AMD will use it in Sony's machine too. Right?

In the Wii U, "there are four 4Gb (512MB) Hynix DDR3-1600 devices surrounding the Wii U's MCM (Multi Chip Module). Memory is shared between the CPU and GPU, and if I'm decoding the DRAM part numbers correctly it looks like these are 16-bit devices giving the Wii U a total of 12.8GB/s of peak memory bandwidth." (http://www.anandtech.com/show/6465/nintendo-wii-u-teardown)
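
As a sanity check on that figure, here's the peak-bandwidth arithmetic in a quick Python sketch (assuming, per the teardown, that the four 16-bit chips form a combined 64-bit bus):

```python
# Peak bandwidth = bus width in bytes x transfer rate.
chips = 4
bits_per_chip = 16            # per the decoded part numbers
transfer_rate = 1600e6        # DDR3-1600: 1600 MT/s effective

bus_bytes = chips * bits_per_chip / 8           # 8 bytes (64-bit bus)
peak_bw_gbps = bus_bytes * transfer_rate / 1e9  # bytes/s -> GB/s

print(f"{peak_bw_gbps:.1f} GB/s")               # -> 12.8 GB/s, matching AnandTech
```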

Because MS halved the memory bandwidth of the R500-based GPU in the Xbox 360, and the Wii U's GPU bandwidth is likewise crippled by shared DDR3-1600 instead of dedicated GDDR5, both of those solutions try to mask the bandwidth penalty by including eDRAM. If you want to retain the full power of the GPU, you go with GDDR5 for the GPU and drop the eDRAM; if you want to save costs, you go with cheaper DDR memory plus eDRAM.

If the PS4 uses dedicated GDDR5 for the GPU, that's actually the optimal approach (which is why no AMD/NV GPUs have eDRAM on the PCB). Despite sounding fancy, eDRAM/eSRAM is really a cost-saving solution that minimizes the performance penalty of forgoing a wider memory bus + GDDR5. The lack of eDRAM on the PS4 combined with GDDR5 would actually be a good thing, not a disadvantage: it would imply the graphics sub-system is not compromised. If Sony halves the 256-bit bus of the HD7970M, then sure, eDRAM is possible. A rough comparison is sketched below.
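
To put the trade-off in numbers, here's the same arithmetic applied to a few configurations; the GDDR5 data rates and the halved-bus case below are illustrative assumptions, not confirmed specs:

```python
def peak_bw_gbps(bus_bits: int, mt_per_s: int) -> float:
    """Peak bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return (bus_bits / 8) * mt_per_s / 1e3   # MT/s -> GB/s

# Bus widths and GDDR5 data rates are assumptions for illustration only.
configs = {
    "64-bit DDR3-1600 (Wii U-style)":  (64, 1600),
    "128-bit GDDR5-5000 (halved bus)": (128, 5000),
    "256-bit GDDR5-5000 (full bus)":   (256, 5000),
}

for name, (bits, mts) in configs.items():
    print(f"{name}: {peak_bw_gbps(bits, mts):6.1f} GB/s")

# 64-bit DDR3-1600:   12.8 GB/s  -- why the Wii U leans on eDRAM
# 128-bit GDDR5:      80.0 GB/s  -- a halved bus is where eDRAM would help
# 256-bit GDDR5:     160.0 GB/s  -- full bus; no eDRAM needed
```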