z101 said:
But eDRAM is best when the programmer knows it is there and can fully utilize it. Using eDRAM on PCs is difficult: even a developer who wants to use it must write code paths for PCs both with and without eDRAM. eDRAM is perfect for consoles, though, with their standardized hardware. Modern graphics hardware uses eDRAM too: http://www.theinquirer.net/inquirer/news/2265428/intel-haswell-chips-will-have-onpackage-dram-for-cpu-and-gpu Even Sony's lead system architect for the PS4 admits that eDRAM gives a performance boost: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2
Whilst what you say is true (about the programmer knowing what's available), it's also almost irrelevant to the majority of developers in this day and age, when most games are multiplatform. Most developers don't have enough time to spend optimising their software for only one platform, so most 3rd-party devs will do the bare minimum to get the game running smoothly.
Look at this gen: most developers used the 360's eDRAM for only a small subset of tasks. It was worse on the PS3, with its Cell processor, split RAM and a GPU with fixed pixel and vertex shaders. It was such a nightmare for most devs that a number of 3rd-party titles look and perform worse than their 360 equivalents. Early games didn't even make use of the SPEs in the Cell processor.
It's only the exclusive devs that really have the time to optimise their code thoroughly, and that only really comes with working on a single platform.
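For concreteness, the "with or without eDRAM" burden from the quoted post boils down to maintaining a branch like the sketch below. This is purely illustrative: every name here is hypothetical (there is no standard PC API for querying eDRAM, which is part of the problem being described).

```python
# Hypothetical sketch of the dual-code-path problem: a PC engine that
# wants to exploit a fast on-package memory pool must detect it and
# keep a fallback path for the machines that lack it.

def detect_fast_memory_bytes():
    """Stand-in for whatever platform query an engine would use;
    returns the size of an on-package memory pool, or 0 if absent.
    Hard-coded to 0 here to model a typical PC without eDRAM."""
    return 0

def choose_render_path():
    fast = detect_fast_memory_bytes()
    if fast >= 10 * 1024 * 1024:  # 10 MB was the Xbox 360's eDRAM size
        # Console-style path: place render targets in the fast pool,
        # tiling the frame if it doesn't fit.
        return "edram_tiled"
    # Generic path: render targets live in ordinary main/graphics memory.
    return "main_memory"

print(choose_render_path())  # → main_memory on a machine reporting no eDRAM
```

On a console the branch disappears: the hardware is fixed, so the "edram_tiled" path can be written once and assumed everywhere, which is the advantage z101 is pointing at.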