
Wii U RAM 43% slower than PS360 RAM

JazzB1987 said:
Single channel or dual channel (dual would double the GB/s)? What about the eDRAM? Talking about this is useless as long as not all the info is available.

Read what I wrote above. 64-bit is the industry standard width for a single DRAM channel. Since the Wii U has four 16-bit-wide DDR3 chips, that points to a single memory controller. That's the front end to the DRAM; we don't know what happens behind the controller. It would be foolish to use two 32-bit channels, because that would only increase transistor costs.
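A quick back-of-envelope check of where the thread title's "43%" comes from, assuming the commonly reported (not officially confirmed) figures: DDR3-1600 on the Wii U's single 64-bit channel versus the Xbox 360's 128-bit GDDR3 at 700 MHz (1400 MT/s effective).

/* Peak-bandwidth sketch using the assumed figures above. */
#include <stdio.h>

int main(void)
{
    double wiiu_gbs = (64.0 / 8.0) * 1600e6 / 1e9;   /* 8 bytes x 1.6 GT/s = 12.8 GB/s */
    double x360_gbs = (128.0 / 8.0) * 1400e6 / 1e9;  /* 16 bytes x 1.4 GT/s = 22.4 GB/s */

    printf("Wii U main RAM : %.1f GB/s\n", wiiu_gbs);
    printf("Xbox 360 GDDR3 : %.1f GB/s\n", x360_gbs);
    printf("difference     : %.0f%% lower\n",
           100.0 * (1.0 - wiiu_gbs / x360_gbs));     /* ~43%, hence the thread title */
    return 0;
}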

Again, the eDRAM is a FRAME BUFFER for the GPU. It needs to be much bigger than the Xbox 360's because it has to simultaneously hold (at least) two 1080p frames and two tablet frames.
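A rough size check of that claim, assuming 32-bit colour, double buffering for both outputs, the commonly reported 854x480 GamePad resolution and the commonly reported 32 MB of eDRAM (none of these are confirmed specs here):

/* Framebuffer budget sketch under the assumptions above. */
#include <stdio.h>

int main(void)
{
    const double MB = 1024.0 * 1024.0;
    double tv_buf   = 1920.0 * 1080.0 * 4.0 / MB;  /* ~7.9 MB per 32-bit 1080p buffer */
    double pad_buf  =  854.0 *  480.0 * 4.0 / MB;  /* ~1.6 MB per GamePad buffer */
    double total    = 2.0 * tv_buf + 2.0 * pad_buf;

    printf("two 1080p + two GamePad buffers ~= %.1f MB of 32 MB eDRAM\n", total);  /* ~19 MB */
    return 0;
}

So the buffers alone already need roughly twice what the Xbox 360's 10 MB of eDRAM could hold.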



Soundwave said:
It's slower than the 360/PS3 RAM, but it probably makes up for that by having more eDRAM.


Yes and no. The Wii U's chip is a SoC with the CPU, GPU and eDRAM all on the same die. The RAM speed may be slower, but since everything is on a single die, connection throughput to off-chip functions (I/O, the Panasonic video chip, etc.) is less of a problem.



I AM BOLO

100% lover "nothing else matter's" after that...

ps:

Proud psOne/2/3/p owner.  I survived Aplcalyps3 and all I got was this lousy Signature.

joeorc said:


Yes and no. The Wii U's chip is a SoC with the CPU, GPU and eDRAM all on the same die.

No. The CPU is not on the same chip carrier as the GPU/eDRAM (as per the Nintendo image of the mainboard posted 1-2 weeks ago).



AlphaCielago said:
JEMC said:

Not long ago (15 days ago to be more precise), Not Enough Shaders had an interview with Shin’en about the Wii U where they said

The CPU and GPU are a good match. As said before, today’s hardware has bottlenecks with memory throughput when you don’t care about your coding style and data layout. This is true for any hardware and can’t be only cured by throwing more megahertz and cores on it. Fortunately Nintendo made very wise choices for cache layout, ram latency and ram size to work against these pitfalls. Also Nintendo took care that other components like the Wii U GamePad screen streaming, or the built-in camera don’t put a burden on the CPU or GPU.

and

When testing our first code on Wii U we were amazed how much we could throw at it without any slowdowns, at that time we even had zero optimizations. The performance problem of hardware nowadays is not clock speed but ram latency. Fortunately Nintendo took great efforts to ensure developers can really work around that typical bottleneck on Wii U. They put a lot of thought on how CPU, GPU, caches and memory controllers work together to amplify your code speed. For instance, with only some tiny changes we were able to optimize certain heavy load parts of the rendering pipeline to 6x of the original speed, and that was even without using any of the extra cores.

source: http://www.notenoughshaders.com/2012/11/03/shinen-mega-interview-harnessing-the-wii-u-power

 

How does that mix with this news?


Lol at bold.

lol

Thank you. Fixed.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.
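The "data layout" and "ram latency" points in the Shin'en quote above are easy to see in plain C. Here is a minimal sketch, nothing Wii U-specific and with an arbitrary 4096x4096 array: the same summation done in cache-friendly and cache-hostile order.

/* Cache-friendliness demo: identical arithmetic, different memory order. */
#include <stdio.h>
#include <time.h>

#define N 4096

static float grid[N][N];                 /* 64 MB, laid out row by row */

/* Row-major walk: consecutive accesses stay on the same cache lines. */
static double sum_rows(void)
{
    double s = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += grid[i][j];
    return s;
}

/* Column-major walk: each access jumps N * sizeof(float) bytes, so nearly
 * every access misses the cache and stalls on RAM latency. */
static double sum_cols(void)
{
    double s = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += grid[i][j];
    return s;
}

int main(void)
{
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            grid[i][j] = 1.0f;

    clock_t t0 = clock();
    double a = sum_rows();
    clock_t t1 = clock();
    double b = sum_cols();
    clock_t t2 = clock();

    printf("row-major: %.3fs   col-major: %.3fs   (sums %.0f / %.0f)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t2 - t1) / CLOCKS_PER_SEC, a, b);
    return 0;
}

On typical hardware the column-major version runs several times slower even though it does exactly the same work, which is the kind of "tiny change" the Shin'en quote is talking about.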

Don't care about numbers or eDRAM or anything like that. I just want to see more comparisons with games like Batman and Black Ops 2. All the hidden technical jargon is moot imo.



drkohler said:
HappySqurriel said:

Edit: essentially, by caching data in eDRAM rather than grabbing it from main memory, Nintendo could be eliminating 50% to 95% of the data transfer across the memory bus, making the difference between a $4 memory module and a $400 memory module meaningless.

Dude, the eDRAM is a FRAME BUFFER, not a cache. The CPU probably doesn't even see it...

And no: when three processors (CPU, GPU and DSP) fight for RAM over a single 64-bit bus, there are bus contention issues, all the time.


And you have documentation for this, I assume?

With the Gamecube, the GPU had 3MB of built-in memory: 2MB was dedicated to frame buffers while the other 1MB was used as a texture cache. Flipper supported (up to) 9-to-1 texture compression, which allowed (in many cases) the Gamecube to have up to (about) 3 million texels in memory (or roughly 6 texels for every pixel rendered).

Why would Nintendo take an approach with the Gamecube that worked so well, roughly emulate it with the Wii U, and ignore one of the biggest advantages it gave them?
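Taking the post's numbers at face value, the texel figure checks out; the 24-bit (3 bytes per texel) starting point is my assumption, since the post doesn't say what the 9:1 is measured against.

/* Texel-budget sketch under the assumptions above. */
#include <stdio.h>

int main(void)
{
    double cache_bytes     = 1.0 * 1024.0 * 1024.0;  /* the 1 MB on-chip texture cache */
    double bytes_per_texel = 3.0 / 9.0;              /* assumed 24-bit texels at 9:1 compression */

    printf("~%.1f million compressed texels fit in 1 MB\n",
           cache_bytes / bytes_per_texel / 1e6);      /* ~3.1 million */
    return 0;
}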

 

Every post of yours generally requires the assumption that Nintendo's engineers are incompetent morons; my assumption is that their design philosophy is different from Microsoft's and Sony's. It is likely that they have enough eDRAM on their CPU/GPU that, when used appropriately, far less data is transferred across the bus, so super-fast main memory isn't really needed. That would explain the earlier comments from developers praising the memory architecture, and I would expect that once textures are cached, memory bandwidth isn't a bottleneck on the Wii U the way it is on the Xbox 360 or PS3.
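A sketch of that argument using only numbers already in this thread (the 12.8 GB/s single-channel figure and the 50% to 95% range quoted above), so treat them as illustrative rather than measured:

/* How far a slow external bus stretches if on-chip memory absorbs most traffic. */
#include <stdio.h>

int main(void)
{
    const double bus_gbs     = 12.8;                      /* single 64-bit DDR3-1600 channel */
    const double hit_rates[] = { 0.00, 0.50, 0.75, 0.95 };

    for (int i = 0; i < 4; i++) {
        /* if a fraction h of accesses never leaves the chip, the external bus
         * only carries (1 - h) of the total demand */
        double sustainable = bus_gbs / (1.0 - hit_rates[i]);
        printf("on-chip hit rate %3.0f%%: a %.1f GB/s bus covers ~%6.1f GB/s of total demand\n",
               hit_rates[i] * 100.0, bus_gbs, sustainable);
    }
    return 0;
}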



HappySqurriel said:
drkohler said:
HappySqurriel said:

Edit: essentially, by caching data in eDRAM rather than grabbing it from main memory, Nintendo could be eliminating 50% to 95% of the data transfer across the memory bus, making the difference between a $4 memory module and a $400 memory module meaningless.

Dude, the eDRAM is a FRAME BUFFER, not a cache. The CPU probably doesn't even see it...

And no: when three processors (CPU, GPU and DSP) fight for RAM over a single 64-bit bus, there are bus contention issues, all the time.


And you have documentation for this, I assume?

Yes. Every computer ever built by any engineer. Of course Nintendo SOFTWARE engineers use caching techniques wherever possible in whatever way possible. In case you did not notice, this is a thread about HARDWARE, not software.



drkohler said:
joeorc said:


Yes and no. The Wii U's chip is a SoC with the CPU, GPU and eDRAM all on the same die.

No. The CPU is not on the same chip carrier as the GPU/eDRAM (as per the Nintendo image of the mainboard posted 1-2 weeks ago).


Umm, yes it is... IT IS INDEED AN MCM.

HERE YA GO

This time we fully embraced the idea of using an MCM for our gaming console. An MCM is where the aforementioned Multi-core CPU chip and the GPU chip are built into a single component. The GPU itself also contains quite a large on-chip memory. Due to this MCM, the package costs less and we could speed up data exchange among two LSIs while lowering power consumption. And also, the international division of labor in general would be cost-effective.

http://iwataasks.nintendo.com/interviews/#/wiiu/console/0/0

 




drkohler said:
HappySqurriel said:
drkohler said:
HappySqurriel said:

Edit: essentially, by caching data in eDRAM rather than grabbing it from main memory, Nintendo could be eliminating 50% to 95% of the data transfer across the memory bus, making the difference between a $4 memory module and a $400 memory module meaningless.

Dude, the eDRAM is a FRAME BUFFER, not a cache. The CPU probably doesn't even see it...

And no: when three processors (CPU, GPU and DSP) fight for RAM over a single 64-bit bus, there are bus contention issues, all the time.


And you have documentation for this, I assume?

Yes. Every computer ever built by any engineer. Of course Nintendo SOFTWARE engineers use caching techniques wherever possible in whatever way possible. In case you did not notice, this is a thread about HARDWARE, not software.

So the Gamecube and Wii didn't use 1MB of their 3MB of on-GPU memory as a texture cache?

In modern times, hardware developers produce firmware, drivers and hardware abstraction layers over their hardware for software developers to use, both to ensure the hardware is used as efficiently as possible and to drastically simplify the process of developing software for it. OpenGL and Direct3D are the best-known hardware-abstracting 3D APIs in the world; Nintendo (for obvious reasons) will develop a hardware abstraction layer compatible with OpenGL and will expose optimizations and unique hardware features using OpenGL extensions.

The fact that you don't understand this demonstrates that you don't know enough to have an informed opinion on the Wii U's hardware architecture.
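For what it's worth, here is a minimal sketch of what "expose optimizations using OpenGL extensions" looks like from the application side. Nothing here is Wii U-specific, and the extension name is made up for illustration; the pattern is simply the standard GL habit of querying the driver's extension string and branching on it.

/* Generic GL extension check; assumes a current compatibility-profile context
 * already exists (created with GLFW, SDL, or similar). */
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Returns 1 if the named extension appears in the driver's extension string. */
static int has_extension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, name) != NULL;
}

void choose_render_path(void)
{
    /* "GL_VENDOR_fast_edram_path" is a hypothetical extension name */
    if (has_extension("GL_VENDOR_fast_edram_path"))
        printf("driver advertises the vendor fast path, using it\n");
    else
        printf("extension not found, falling back to the portable path\n");
}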



Heavenly_King said:
So in Dragon Ball terms, how do the PS3/360/Wii U stand?

Wii U = [image]

360 = [image]

PS3 = [image]

Wii = [image]



Atto Suggests...:

Book - Malazan Book of the Fallen series 

Game - Metro Last Light

TV - Deadwood

Music - Forest Swords