z101 said:
jake_the_fake1 said:
z101 said:
jake_the_fake1 said:

On the second part, you do realise that the PS4 has the same setup but it's integrated into a single chip, plus both the CPU and GPU have access to 8GB of high-bandwidth RAM, making eDRAM a non-requirement.


The Wii U's eDRAM bandwidth is much higher than that of the normal RAM the PS4 uses. Interesting statement from the PS4's lead system architect:

For example, if we use eDRAM (on-chip DRAM) for the main memory in addition to the external memory, the memory bandwidth will be several terabytes per second. It will be a big advance in terms of performance.

He even explains why the PS4 doesn't use eDRAM:

However, in that case, we will make developers solve a puzzle, 'To realize the fastest operation, what data should be stored in which of the memories, the low-capacity eDRAM or the high-capacity external memory?' We wanted to avoid such a situation. We put the highest priority on allowing developers to spend their time creating values for their games.

Sony didn't use eDRAM because they wanted to make a console that is very easy to handle even for average programmers, so they sacrificed performance for ease of programming (there's a small sketch of that trade-off after the source link below). The other reason is that eDRAM is very expensive.

Source: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2
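
To make the "puzzle" Cerny describes concrete, here is a minimal C sketch of the two worlds. Nothing in it is a real console API; edram_alloc(), ddr_alloc() and the 32MB figure are made up purely for illustration. With a split pool, every bandwidth-hungry allocation is a placement decision; with one unified pool there is nothing to decide.

#include <stdio.h>
#include <stdlib.h>

/* Hypothetical pool size, just for illustration. */
#define EDRAM_SIZE (32u * 1024 * 1024)        /* small, very fast on-chip pool */
static size_t edram_used = 0;

/* Stand-ins for pool-specific allocators; here they just malloc and log. */
static void *edram_alloc(size_t n) { printf("fast pool: %zu bytes\n", n); return malloc(n); }
static void *ddr_alloc(size_t n)   { printf("slow pool: %zu bytes\n", n); return malloc(n); }

/* Split-pool world: every allocation is a placement decision. */
void *alloc_render_target(size_t bytes, int bandwidth_critical)
{
    if (bandwidth_critical && edram_used + bytes <= EDRAM_SIZE) {
        edram_used += bytes;
        return edram_alloc(bytes);    /* e.g. a frame buffer that needs the bandwidth */
    }
    return ddr_alloc(bytes);          /* whatever doesn't fit stays in main RAM */
}

/* Unified-pool world (PS4 style): one pool, no decision to make. */
void *alloc_buffer(size_t bytes) { return ddr_alloc(bytes); }

int main(void)
{
    alloc_render_target(1280u * 720u * 4u, 1);   /* 720p colour buffer -> fast pool */
    alloc_render_target(64u * 1024 * 1024, 1);   /* too big for eDRAM  -> slow pool */
    alloc_buffer(8u * 1024 * 1024);              /* unified: no placement question  */
    return 0;
}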

...

Also, I'd like to point out that neither the high-end GPUs from Nvidia nor those from AMD use eDRAM on their graphics cards. In fact, the stock GTX 680 has a bandwidth of 192.2GB/s, while the Titan has 288.4GB/s, and those were the graphical beasts of their time. Just a little perspective.

...

I'd also like to emphasise that Cerny's approach was developer-centric; as he said, he wanted to remove any stumbling blocks. Split memory setups comprising a small, fast pool and a large, slow pool make developers' lives hard; they would rather have one large and fast pool of RAM, and Sony has done exactly that, hence Cerny's decision to go with 8GB of GDDR5 at 176GB/s of bandwidth: the best of both worlds.
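
For what it's worth, that 176GB/s figure falls straight out of the publicly quoted PS4 memory spec (a 256-bit GDDR5 bus at 5.5Gbit/s per pin); a quick sanity check in C:

#include <stdio.h>

int main(void)
{
    /* PS4 GDDR5: 256-bit bus at 5.5 Gbit/s per pin (the publicly quoted spec). */
    const double bus_width_bits = 256.0;
    const double per_pin_gbps   = 5.5;

    double gb_per_s = bus_width_bits * per_pin_gbps / 8.0;   /* bits -> bytes */
    printf("peak bandwidth: %.0f GB/s\n", gb_per_s);         /* prints 176 GB/s */
    return 0;
}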

 

 


There is no need to argue about it: eDRAM gives a huge advantage in power for a console, and even the PS4's lead architect admitted that, but they decided not to use eDRAM in the PS4 because programming would be more complicated for the average programmer.

eDRAM is not as effective on PCs, because to really exploit it the program code has to use it explicitly, and no PC programmer will assume that there is eDRAM on a graphics card. But in the future some high-end graphics cards will feature eDRAM, and dedicated logic will use it automatically even when the program code doesn't, which will give a performance boost. Of course, the boost could be bigger if the program (game) is coded to use the eDRAM directly.
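
A rough illustration of "must be really used by the program code": with software-managed fast memory, the program has to stage data in and out itself, whereas a hardware cache does that transparently. This is just a toy sketch in plain C, not how any particular GPU actually works:

#include <stdio.h>
#include <string.h>

/* Toy model of a software-managed fast memory: the program itself copies
 * hot data in, works on it there, and copies the result back out. */
#define SCRATCH_SIZE (1024u * 1024u)
static unsigned char scratchpad[SCRATCH_SIZE];   /* stand-in for on-chip eDRAM */

void process_tile(unsigned char *main_ram, size_t tile_bytes)
{
    if (tile_bytes > SCRATCH_SIZE)               /* tiles must be sized to fit */
        return;

    memcpy(scratchpad, main_ram, tile_bytes);    /* explicit copy-in          */
    for (size_t i = 0; i < tile_bytes; i++)      /* work at fast-memory speed */
        scratchpad[i] = (unsigned char)(255 - scratchpad[i]);
    memcpy(main_ram, scratchpad, tile_bytes);    /* explicit copy-out         */
}

int main(void)
{
    static unsigned char image[4u * 1024u];      /* pretend frame data in main RAM */
    process_tile(image, sizeof image);
    printf("first byte after processing: %u\n", image[0]);   /* 255, was 0 */
    return 0;
}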

I agree with you, but this is why I questioned what the bandwidth of the eDRAM the Wii U is using actually is. So far we don't know, and we can't just assume the highest figure; we don't even know how it fits into the architecture. That's why I described how the 360 uses it, to give a little perspective, seeing as the Wii U and the 360 have such similarities in their design:

"Keep in mind that even the 360 EDram had a bandwidth of 256GB/s only to it's FPU allowing it to take the brunt of bandwidth intensive operations like AA, however to the rest of the system the bandwidth was just 32GB/s...Microsoft used some smart words for their console war cannons, which is why if you look at the Xboxone Microsoft has an ESram of 102GB/s which makes sense since it's a bandwidth for the whole system and not for just 1 aspect of a component. So with what I just mentioned, Nintendo could have easily taken a xbox360 approach in it's hardware guts but of course tinkered with it a little."

eDRAM is way too expensive to reach graphics-card capacities like 6GB, and no way in hell would they even consider another split pool: developers already have to contend with two pools, and three pools would be stupidly hard for no reason. Also, GDDR5 has the bandwidth required to feed the shaders, and it's cheaper to get the required capacities.