ethomaz said:

dahuman said:

So once again, if you've been following Wii U news, the devs have generally praised it based on latency. I'm hoping you understand what that means; if not, go here: http://en.wikipedia.org/wiki/CAS_latency

Just because a piece of memory has lower bandwidth doesn't mean it's actually slower. We don't have enough data to say whether it's slower; it all depends on what the CAS number is, and we don't have that number, so no opinion really matters. Not to mention we don't even know what the memory controller in the CPU is actually doing: how customized is it? Is it single channel or dual channel? Nobody knows but Nintendo.

The best-case latency for Wii U memory is CL5, which gives a 6.25 ns interval... great, but it's not going to work miracles (even so, I don't expect the Wii U to use the best case... I think for mass production CL6 or CL7 is a better choice).
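(For reference, that 6.25 ns figure is just the CAS cycles multiplied by the cycle time. Here's a quick sketch of the math, assuming DDR3-1600, i.e. an 800 MHz command clock, which is an assumption on my part, not a confirmed spec:)

```python
# Quick check of the CAS latency -> nanoseconds conversion.
# Assumption: DDR3-1600, i.e. an 800 MHz command clock (1.25 ns per cycle).
COMMAND_CLOCK_MHZ = 800

def cas_latency_ns(cl_cycles, clock_mhz=COMMAND_CLOCK_MHZ):
    cycle_time_ns = 1000.0 / clock_mhz  # 1.25 ns at 800 MHz
    return cl_cycles * cycle_time_ns

for cl in (5, 6, 7):
    print(f"CL{cl}: {cas_latency_ns(cl):.2f} ns")
# CL5: 6.25 ns, CL6: 7.50 ns, CL7: 8.75 ns
```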

So all that really matters is how the 32MB of eDRAM can help the main memory, or rather, how developers can use the eDRAM to avoid the slow main RAM... that's my point.

32MB of eDRAM can help achieve 1080p, but not AA at the same time... so for a complex game it has to be 720p.

The memory controller is quad-channel, 16 bits per channel (64 bits total).
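(If that 64-bit bus is running DDR3-1600, i.e. 1600 MT/s, which again is an assumption rather than a confirmed spec, the peak bandwidth works out like this:)

```python
# Peak theoretical bandwidth from bus width and transfer rate.
# Assumption: 4 x 16-bit channels (64-bit bus) at 1600 MT/s (DDR3-1600).
BUS_WIDTH_BITS = 4 * 16
TRANSFERS_PER_SECOND = 1600e6

bytes_per_transfer = BUS_WIDTH_BITS / 8                      # 8 bytes
peak_gb_per_s = bytes_per_transfer * TRANSFERS_PER_SECOND / 1e9
print(f"{peak_gb_per_s:.1f} GB/s")                           # 12.8 GB/s
```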


If the latency is really that low (CL5, around 6.25 ns), then it depends on how the CPU is built; in theory the Wii U would still be faster in real-world performance than the PS3 and 360 at the memory level, because memory access time is the one weakness both the 360 and PS3 had due to the way their CPUs work. Another factor in the equation is how far away the CPU is from the RAM in cycles. It all depends on how well Nintendo has optimized this machine versus just tossing in raw power along with bottlenecks.
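(To put the "cycles away from RAM" question in rough numbers, here's a sketch; the 1.24 GHz CPU clock is purely an assumed figure for illustration, and this only counts the CAS portion of an access, not the full round trip through the memory controller:)

```python
# Rough conversion of a memory latency in nanoseconds into CPU cycles.
# Assumption: a ~1.24 GHz CPU clock, used only for illustration; real
# round-trip latency to main RAM is considerably higher than CAS alone.
CPU_CLOCK_GHZ = 1.24

def latency_in_cpu_cycles(latency_ns, clock_ghz=CPU_CLOCK_GHZ):
    return latency_ns * clock_ghz

print(f"{latency_in_cpu_cycles(6.25):.1f} CPU cycles")  # ~7.8 cycles
```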

Just remember that bandwidth isn't everything; clock speed only truly mattered in the SDR days. I'm also not too worried about AA with the advent of MLAA, TBH.