
Wii U RAM 43% slower than PS360 RAM

goddog said:


The memory won't be a major issue unless this is what the video card is using... then I'd be like, what the hell, big N. But I think the video card uses eDRAM on the GPU die...

Huh? 2 GB of 64-bit-wide memory; that's all the memory there is for everyone (CPU, GPU, DSP) to fight over. eDRAM is a (complex) frame buffer, nothing more, nothing less.




So in Dragon Ball terms, how do the PS3/360/Wii U stand?



drkohler said:
blackstarr said:
Can somebody please break this down into non-tech lingo for plebeians like me?
Is it looking like the gap between Wii U and the other next gen consoles is going to be irreconcilably big?

It basically means that the Wii U has a $4 memory system that was outdated in 2006. Give the next-gen consoles 4/8 GB of 128-bit memory (256-bit would probably be a stretch in a console), and developers will have a very hard time getting the Wii U's performance to match that. It doesn't mean that Wii U ports will look significantly worse than the next-gen versions, but it does become more likely.
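To put rough numbers on the bus-width point, here's a quick sketch in Python (the 1600 MT/s transfer rate is purely illustrative, not a confirmed spec for any of these consoles):

# Peak theoretical bandwidth = transfer rate * bus width in bytes.
def peak_bandwidth_gbs(transfers_mts, bus_width_bits):
    return transfers_mts * 1e6 * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gbs(1600, 64))    # 12.8 GB/s on a 64-bit bus
print(peak_bandwidth_gbs(1600, 128))   # 25.6 GB/s on a 128-bit bus
print(peak_bandwidth_gbs(1600, 256))   # 51.2 GB/s, the 256-bit "stretch" case

Doubling the bus width doubles peak bandwidth at the same transfer rate, which is why the 64-bit vs. 128-bit distinction matters so much here.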

More accurately, the Wii U may be using slow memory because the architecture negates the need for faster memory ... Of course, waiting to have a complete understanding of the architecture before you trash it would be completely unreasonable.

Edit: Essentially, by caching data in eDRAM rather than grabbing it from main memory, Nintendo could be eliminating 50% to 95% of the data transfer across the memory bus, making the difference between a $4 memory module and a $400 memory module meaningless.
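A quick sketch of the arithmetic behind that claim (the 20 GB/s total demand figure is an illustrative assumption; only the 50%-95% range comes from the post above):

# If a fraction of accesses is served from on-die eDRAM, only the
# misses ever cross the external memory bus.
def external_traffic_gbs(total_demand_gbs, edram_hit_rate):
    return total_demand_gbs * (1.0 - edram_hit_rate)

print(external_traffic_gbs(20.0, 0.50))  # 10.0 GB/s still hits the bus
print(external_traffic_gbs(20.0, 0.95))  #  1.0 GB/s still hits the bus

At the high end of that range, even a slow external bus would be far from the bottleneck; whether real workloads get anywhere near a 95% hit rate is the open question.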



The memory is 1066 MHz max as per Samsung's spec sheet, btw, so 8.5 GB/s of bandwidth... That's only two times faster than the Wii's memory.

Seriously considering cancelling my pre-order.
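For reference, the arithmetic behind that 8.5 GB/s figure, as a rough sketch (the DDR3-1600 line and the 360 comparison are assumptions based on commonly cited specs, not confirmed Wii U figures):

# Peak bandwidth = transfer rate * bus width in bytes.
print(1066e6 * 8 / 1e9)    # ~8.5 GB/s: DDR3 at 1066 MT/s on a 64-bit bus
print(1600e6 * 8 / 1e9)    # 12.8 GB/s if the same bus ran at DDR3-1600
print(1400e6 * 16 / 1e9)   # 22.4 GB/s: the 360's GDDR3 (1400 MT/s, 128-bit)
print(1 - 12.8 / 22.4)     # ~0.43, i.e. the "43% slower" in the thread title

Note these are peak theoretical numbers; sustained bandwidth is always lower.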



Baron said:
The memory is 1066 MHz max as per Samsung's spec sheet, btw, so 8.5 GB/s of bandwidth... That's only two times faster than the Wii's memory.

Seriously considering cancelling my pre-order.

Did you only buy a Wii U because of its graphics or something?




Muahahahahahahaha!!!! XD



blackstarr said:
Baron said:
The memory is 1066 MHz max as per Samsung's spec sheet, btw, so 8.5 GB/s of bandwidth... That's only two times faster than the Wii's memory.

Seriously considering cancelling my pre-order.

Did you only buy a Wii U because of its graphics or something?


Of course not. In terms of graphics, all consoles are obsolete, as my PC, as it is now, will undoubtedly smash the next Xbox and the PS4 in that department as well. I was, however, looking forward to playing a wide variety of 1st-party AND 3rd-party games on the Wii U, as opposed to my Wii, which is only occasionally turned on for Mario, Zelda, Smash Bros., and the like.

With this kind of memory, I very much doubt 3rd parties will even bother reworking their entire game architecture just to accommodate the Wii U's memory architecture. The Wii U could be fine if the next Xbox and PS4 use similar (albeit more) memory, but you and I both know they'll probably feature much faster video RAM that'll make it a lot easier for devs. So I won't be surprised if the Wii U ends up with the shit ports and spinoffs that plagued the Wii.

I don't know if I want to buy a console just for its 1st-party games anymore.



HappySqurriel said:

Edit: Essentially, by caching data in eDRAM rather than grabbing it from main memory, Nintendo could be eliminating 50% to 95% of the data transfer across the memory bus, making the difference between a $4 memory module and a $400 memory module meaningless.

Dude, the eDRAM is a FRAME BUFFER, not a cache. The CPU probably doesn't even see it...

And no, when three processors (CPU, GPU, and DSP) fight for RAM over a single 64-bit bus, there are bus contention issues, all the time.
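One way to picture the contention point: a simple proportional-sharing model in Python (the per-client demand figures are made-up illustrations; the 8.5 GB/s peak is the figure quoted earlier in the thread):

# Three clients sharing one bus: when combined demand exceeds the
# bus's peak, everyone gets scaled back proportionally.
def achieved_gbs(demands, peak_gbs=8.5):
    scale = min(1.0, peak_gbs / sum(demands.values()))
    return {name: gbs * scale for name, gbs in demands.items()}

print(achieved_gbs({"CPU": 3.0, "GPU": 8.0, "DSP": 0.5}))
# {'CPU': ~2.2, 'GPU': ~5.9, 'DSP': ~0.4}: each gets ~74% of its demand

And this is the optimistic model: real arbitration overhead and DRAM row conflicts push effective throughput below even these numbers.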



Baron said:
blackstarr said:
Baron said:
The memory is 1066 MHz max as per Samsung's spec sheet, btw, so 8.5 GB/s of bandwidth... That's only two times faster than the Wii's memory.

Seriously considering cancelling my pre-order.

Did you only buy a Wii U because of its graphics or something?


Of course not. In terms of graphics, all consoles are obsolete, as my PC, as it is now, will undoubtedly smash the next Xbox and the PS4 in that department as well. I was, however, looking forward to playing a wide variety of 1st-party AND 3rd-party games on the Wii U, as opposed to my Wii, which is only occasionally turned on for Mario, Zelda, Smash Bros., and the like.

With this kind of memory, I very much doubt 3rd parties will even bother reworking their entire game architecture just to accommodate the Wii U's memory architecture. The Wii U could be fine if the next Xbox and PS4 use similar (albeit more) memory, but you and I both know they'll probably feature much faster video RAM that'll make it a lot easier for devs. So I won't be surprised if the Wii U ends up with the shit ports and spinoffs that plagued the Wii.

I don't know if I want to buy a console just for its 1st-party games anymore.

Fair enough. I'm thinking that E3 2013 is really going to be a huge, pivotal moment for Nintendo. Between now and then, several more quality games will be released on Wii U - Rayman Legends, Wonderful 101, Pikmin 3 (can't wait, sigh)... And at E3 2013 we will probably see an announcement from Microsoft about their new console, so we will really see if the Wii U can hold up against it.
Not to mention, if Nintendo has some great games announced for Wii U at E3 2013, that'll score some points for Nintendo. (SSB U? Zelda or Mario? Whatever Retro is working on?)



drkohler said:
goddog said:


The memory won't be a major issue unless this is what the video card is using... then I'd be like, what the hell, big N. But I think the video card uses eDRAM on the GPU die...

Huh? 2 GB of 64-bit-wide memory; that's all the memory there is for everyone (CPU, GPU, DSP) to fight over. eDRAM is a (complex) frame buffer, nothing more, nothing less.


I thought the eDRAM on die was being used for the GPU and was substantially larger than a normal eDRAM cache; is that not the case? Well, damn, that's a step back if so… There is no reason to use DDR3 to feed a GPU these days; it's just a bad idea.
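For scale, a rough sketch of how much memory a conventional frame buffer needs (the formats here, 32-bit color plus 32-bit depth with double buffering, are typical assumptions rather than known Wii U settings):

# Frame buffer footprint: double-buffered color plus one depth buffer.
def framebuffer_mb(width, height, bytes_color=4, bytes_depth=4):
    color = width * height * bytes_color * 2   # front + back buffer
    depth = width * height * bytes_depth       # single depth buffer
    return (color + depth) / (1024 * 1024)

print(framebuffer_mb(1280, 720))    # ~10.5 MB at 720p
print(framebuffer_mb(1920, 1080))   # ~23.7 MB at 1080p

So a pool of eDRAM in the tens of megabytes, as rumored for the Wii U, really could keep the render targets (and then some) off the external DDR3 bus.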



come play minecraft @  mcg.hansrotech.com

minecraft name: hansrotec

XBL name: Goddog