jake_the_fake1 said:
... Also I'd like to point out that neither Nvidia's nor AMD's high-end GPUs use eDRAM in their graphics cards; in fact the stock GTX 680 has a bandwidth of 192.2GB/s, while the Titan has 288.4GB/s, graphical beasts of their time, just a little perspective. .. I'd also like to emphasise that Cerny's approach was developer-centric, and as he said he wanted to remove any stumbling blocks. Split memory setups comprising a small fast RAM and a large slow RAM make developers' lives hard; they would rather have one large, fast pool of RAM, and Sony have done exactly that, hence Cerny's decision to go with 8GB of GDDR5 with 176GB/s of bandwidth, the best of both worlds.
|
There is no debate necessary about it: eDRAM gives a huge performance advantage in a console, and even the PS4's lead architect conceded as much, but Sony decided not to use eDRAM on the PS4 because programming would be more complicated for the average programmer.
eDRAM is not as effective on PCs, because to really exploit it the program code must use it explicitly, and no PC programmer will assume that eDRAM is present on a graphics card. But in the future some high-end graphics cards may feature eDRAM, and special logic could use it automatically even when the program code does not, giving a performance boost. Of course this boost would be higher if the program (the game) were coded to use eDRAM directly.
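For perspective on the bandwidth figures quoted above: GDDR5 peak bandwidth follows directly from the effective data rate and the bus width. A quick back-of-envelope check in Python (the data rates and bus widths are the commonly published specs for these cards, not numbers from the posts themselves):

```python
def gddr5_bandwidth_gbs(data_rate_mts, bus_width_bits):
    """Peak bandwidth in GB/s: effective transfer rate (MT/s) times bus width in bytes."""
    return data_rate_mts * (bus_width_bits // 8) / 1000.0

# PS4: 5500 MT/s GDDR5 on a 256-bit bus -> 176.0 GB/s
print(gddr5_bandwidth_gbs(5500, 256))
# GTX 680: 6008 MT/s on a 256-bit bus -> ~192.2 GB/s
print(gddr5_bandwidth_gbs(6008, 256))
# GTX Titan: 6008 MT/s on a 384-bit bus -> ~288.4 GB/s
print(gddr5_bandwidth_gbs(6008, 384))
```

So all three of the figures in the quote are just "bus width times clock"; the PS4 gets its 176GB/s from the same kind of wide GDDR5 bus as the discrete cards.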
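The point that eDRAM only pays off when the code actually hits it can be made concrete with a toy model: if a fraction h of memory traffic is served by a small fast pool and the rest by the big slow pool, the effective bandwidth is the harmonic mix of the two. A minimal sketch (all numbers hypothetical, chosen only to show the shape of the curve):

```python
def effective_bandwidth(hit_fraction, fast_gbs, slow_gbs):
    """Harmonic-mean model: time per byte is split between the fast and slow pools."""
    return 1.0 / (hit_fraction / fast_gbs + (1.0 - hit_fraction) / slow_gbs)

# Hypothetical pools: a 1000 GB/s eDRAM scratch and 100 GB/s main memory.
# With no hits the eDRAM is dead weight; even a 50% hit rate still leaves
# effective bandwidth far below the fast pool's headline figure.
for h in (0.0, 0.5, 0.9):
    print(h, round(effective_bandwidth(h, 1000.0, 100.0), 1))
```

This is why a transparent cache in front of the eDRAM would help the average PC program somewhat, while code written against the scratchpad (pushing h toward 1) would gain much more.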