z101 said:
jake_the_fake1 said:

On the second part, you do realise that the PS4 has the same setup, but it's integrated into a single chip, plus both the CPU and GPU have access to 8GB of high-bandwidth RAM, making eDRAM a non-requirement,


The Wii U's eDRAM bandwidth is much faster than the normal RAM the PS4 uses. Interesting statement from the PS4's lead system architect:

For example, if we use eDRAM (on-chip DRAM) for the main memory in addition to the external memory, the memory bandwidth will be several terabytes per second. It will be a big advance in terms of performance.

He even explains why the PS4 doesn't use eDRAM:

However, in that case, we will make developers solve a puzzle, 'To realize the fastest operation, what data should be stored in which of the memories, the low-capacity eDRAM or the high-capacity external memory?' We wanted to avoid such a situation. We put the highest priority on allowing developers to spend their time creating values for their games.

Sony didn't use eDRAM because they wanted to make a console that is very easy to handle, even for dumb programmers, so they sacrificed performance for easy programming; the other reason is that eDRAM is very expensive.

Source: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2

How do you know that the Wii U's eDRAM bandwidth is faster than the PS4's RAM?

Nintendo hasn't revealed its eDRAM's bandwidth; all we really know is that it's 32MB. The eDRAM is there in the first place to mitigate the very slow DDR3 RAM the console uses, just as it was on the 360. You simply can't assume that, because Cerny mentioned he could have used eDRAM with 'several terabytes per second' but chose not to, the Wii U's eDRAM is somehow that fast. Keep in mind eDRAM at large capacities is expensive, and Nintendo doesn't do expensive; it would be fairer to say that Nintendo would have chosen a lower-specced eDRAM that met their requirements, just as they've done with the rest of the machine's specs.

Keep in mind that even the 360's eDRAM had its 256GB/s of bandwidth only between the eDRAM and the ROP logic on the same daughter die, allowing it to take the brunt of bandwidth-intensive operations like AA; to the rest of the system the bandwidth was just 32GB/s. Microsoft used some clever wording in its console-war salvos, which is why, if you look at the Xbox One, Microsoft quotes an eSRAM bandwidth of 102GB/s, a figure that makes sense since it's bandwidth available to the whole system and not just one aspect of one component. With all that said, Nintendo could easily have taken the Xbox 360 approach in its hardware guts, of course tinkering with it a little.

http://www.anandtech.com/show/1689/2

http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/3 (more recent comparison)
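The two 360 eDRAM figures above can be reconciled with a quick back-of-the-envelope check. This is just a sketch; the 2048-bit read plus 2048-bit write paths at 500 MHz are my assumption based on commonly reported Xenos specs, not something stated in this thread:

```python
def bw_gb_s(bits_per_cycle, clock_mhz):
    """Peak theoretical bandwidth in GB/s (GB = 1e9 bytes, the convention vendors quote)."""
    return bits_per_cycle / 8 * clock_mhz * 1e6 / 1e9

# Internal to the daughter die: assumed 2048-bit read + 2048-bit write at 500 MHz,
# which is where the headline 256GB/s comes from.
print(bw_gb_s(2048 + 2048, 500))  # 256.0
# The link from the GPU to the daughter die was far narrower, hence the
# ~32GB/s the rest of the system actually saw.
```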

Also, I'd like to point out that neither Nvidia's nor AMD's high-end GPUs use eDRAM on their graphics cards; in fact, the stock GTX 680 has a bandwidth of 192.2GB/s, while the Titan has 288.4GB/s, and those were the graphical beasts of their time. Just a little perspective.

I'd also like to emphasise that Cerny's approach was developer-centric; as he said, he wanted to remove any stumbling blocks. Split memory setups comprising a small fast RAM and a large slow RAM make developers' lives harder; they would rather have one large, fast pool of RAM. Sony has done exactly that, hence Cerny's decision to go with 8GB of GDDR5 RAM at 176GB/s of bandwidth: the best of both worlds.
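All the bandwidth figures quoted in this post fall out of the same formula: peak bandwidth = bus width × effective data rate. A quick sketch (the per-card data rates are my assumptions for the stock configurations, not taken from this thread):

```python
def peak_bw_gb_s(bus_bits, data_rate_mt_s):
    """Peak theoretical bandwidth in GB/s: (bus width in bytes) * (transfers per second)."""
    return bus_bits / 8 * data_rate_mt_s * 1e6 / 1e9

# PS4: 256-bit GDDR5 at an assumed 5500 MT/s effective
print(peak_bw_gb_s(256, 5500))  # 176.0, matching the figure Cerny quotes
# GTX 680: 256-bit GDDR5 at an assumed 6008 MT/s
print(peak_bw_gb_s(256, 6008))  # 192.256, i.e. the ~192.2GB/s quoted above
# Titan: 384-bit GDDR5 at an assumed 6008 MT/s
print(peak_bw_gb_s(384, 6008))  # 288.384, i.e. the ~288.4GB/s quoted above
```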