
Wii U's eDRAM stronger than given credit?

ICStats said:

The 360's eDRAM was theoretically fast enough to never be a bottleneck in the pipeline, so even if it were 2 or 4 times faster it would not change its performance as part of the system.

Probably for Wii U it's the same. The bottleneck on the GPU is likely in the shaders. Wii U also seems bottlenecked in its CPU and SIMD throughput.

The problem with the 360's eDRAM was that it was too small. Wii U has 3.2 times as much, so it's better equipped in that regard.
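
For reference, the 3.2x figure follows directly from the widely reported capacities:

    360 eDRAM:   10 MB
    Wii U eDRAM: 32 MB
    32 MB / 10 MB = 3.2x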

The CPU is Wii U's weak point, though I do think it's underestimated, as it's often judged by its performance on games built for PS3/360's CPUs.



curl-6 said:
ICStats said:

The 360's eDRAM was theoretically fast enough to never be a bottleneck in the pipeline, so even if it were 2 or 4 times faster it would not change its performance as part of the system.

Probably for Wii U it's the same. The bottleneck on the GPU is likely in the shaders. Wii U also seems bottlenecked in its CPU and SIMD throughput.

The problem with the 360's eDRAM was that it was too small. Wii U has 3.2 times as much, so it's better equipped in that regard.

The CPU is Wii U's weak point, though I do think it's underestimated, as it's often judged by its performance on games built for PS3/360's CPUs.

This argument is bogus, as PS3, 360 and Wii U all have PPC CPUs.  The games built for PS3/360 are already optimized for PPC.  Any further optimization will help PS3/360 more than Wii U.

The Wii U has a ~2.5X lower clock rate and relies on out-of-order execution to make up the difference.  This means the Wii U CPU's average performance on unoptimized code is better, but its peak is worse than PS3/XB1's.
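
A quick sanity check on the ~2.5X figure, using the commonly reported clocks (Espresso's clock was never officially published):

    Xenon/Cell PPE: 3.2 GHz
    Espresso:      ~1.24 GHz
    3.2 / 1.24 ≈ 2.6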

It also means there is no secret untapped performance, no "secret sauce".  Out-of-order execution is an automatic feature of the Wii U CPU; it's always on already.  Untapped Wii U CPU potential is a lie.




ICStats said:
curl-6 said:
ICStats said:

The 360's eDRAM was theoretically fast enough to never be a bottleneck in the pipeline, so even if it were 2 or 4 times faster it would not change its performance as part of the system.

Probably for Wii U it's the same. The bottleneck on the GPU is likely in the shaders. Wii U also seems bottlenecked in its CPU and SIMD throughput.

The problem with the 360's eDRAM was that it was too small. Wii U has 3.2 times as much, so it's better equipped in that regard.

The CPU is Wii U's weak point, though I do think it's underestimated, as it's often judged by its performance on games built for PS3/360's CPUs.

This argument is bogus, as PS3, 360 and Wii U all have PPC CPUs.  The games built for PS3/360 are already optimized for PPC.  Any further optimization will help PS3/360 more than Wii U.

The Wii U has a ~2.5X lower clock rate and relies on out-of-order processing to make up the difference.  That also means there is no secret untapped performance, no "secret sauce".  Out-of-order execution is an automatic feature of the CPU; it's always on already.  Untapped Wii U CPU potential is a lie.

PPCs are not all the same, you know. Games optimized for Xenon and Cell would not be optimized for Espresso at all. The former are high-clock, low-cache, long-pipeline cores. The latter is a low-clock, high-cache, short-pipeline core with a separate audio chip and a GPGPU to help it out.



curl-6 said:
ICStats said:
curl-6 said:
ICStats said:

The 360's eDRAM was theoretically fast enough to never be a bottleneck in the pipeline, so even if it were 2 or 4 times faster it would not change its performance as part of the system.

Probably for Wii U it's the same. The bottleneck on the GPU is likely in the shaders. Wii U also seems bottlenecked in its CPU and SIMD throughput.

The problem with the 360's eDRAM was that it was too small. Wii U has 3.2 times as much, so it's better equipped in that regard.

The CPU is Wii U's weak point, though I do think it's underestimated, as it's often judged by its performance on games built for PS3/360's CPUs.

This argument is bogus, as PS3, 360 and Wii U all have PPC CPUs.  The games built for PS3/360 are already optimized for PPC.  Any further optimization will help PS3/360 more than Wii U.

The Wii U has a ~2.5X lower clock rate and relies on out-of-order processing to make up the difference.  That also means there is no secret untapped performance, no "secret sauce".  Out-of-order execution is an automatic feature of the CPU; it's always on already.  Untapped Wii U CPU potential is a lie.

PPCs are not all the same, you know. Games optimized for Xenon and Cell would not be optimized for Espresso at all. The former are high-clock, low-cache, long-pipeline cores. The latter is a low-clock, high-cache, short-pipeline core with a separate audio chip and a GPGPU to help it out.


Yes, I know that.  Do you know what it means?

Shorter pipelines = lower branch mispredict cost, automatically.
Larger caches = fewer code & data eviction problems, automatically.
Out-of-order = higher IPC, automatically.

Did I mention "automatically" enough?  It means it's not secret sauce, or something you have to optimize for.  It automatically runs PS3/360 code more efficiently.  Games are already benefiting from that.
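
To make the pipeline point concrete, here's a minimal C sketch. The depths are the commonly cited ones (4 stages for Espresso's PPC 750-class cores, roughly 20+ for the PPE-style cores in Xenon/Cell), not official figures:

    #include <stdio.h>
    #include <stdlib.h>

    /* A data-dependent branch the predictor can't learn (~50% miss rate).
       Each miss flushes the pipeline, so the penalty is roughly
       proportional to pipeline depth: ~4 cycles of lost work on a
       4-stage core vs ~20+ on a long-pipeline core, for the same code. */
    int main(void) {
        int sum = 0;
        for (int i = 0; i < 1000000; i++) {
            if (rand() & 1)
                sum += i;
            else
                sum -= i;
        }
        printf("%d\n", sum);
        return 0;
    }

Same source, zero porting effort; the short-pipeline core just pays less per miss. That's the "automatic" part.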

As for the audio chip - I would hope that its use is just part of the SDK, and so games are already using it.
As for GPGPU - sure, this is something developers could use to offload some CPU work, but that's a different topic from untapped potential of the CPU.




ICStats said:

Yes, I know that.  Do you know what it means?

Shorter pipelines = lower branch mispredict cost, automatically.
Larger caches = fewer code & data eviction problems, automatically.
Out-of-order = higher IPC, automatically.

Did I mention "automatically" enough?  It means it's not secret sauce, or something you have to optimize for.  It automatically runs PS3/360 code more efficiently.  Games are already benefiting from that.

As for the audio chip - I would hope that its use is just part of the SDK, and so games are already using it.
As for GPGPU - sure, this is something developers could use to offload some CPU work, but that's a different topic from untapped potential of the CPU.

If your "all PPCs are the same" theory was right, then code for Xenon would automatically be optimized for the Cell. Clearly this is not the case, and it's not the case for Espresso either.



curl-6 said:
Pemalite said:
snowdog said:
Games such as Trine 2, Most Wanted, The Wonderful 101, Super Mario 3D World, Mario Kart 8, X, Bayonetta 2 and SSBU say you're wrong.

There have been plenty of games at 720p native, 60fps with v-sync enabled, the last of which is the most important.

It lies somewhere between the PS3 and Xbox One in terms of power, and if Nintendo have evolved the TEV Unit the way I think they have, it explains several things: 1) Latte's ALUs being twice the size they should be, 2) the low power draw, 3) the performance of Latte with games at 720p native, 60fps with v-sync enabled, and 4) negative comments from devs who haven't actually used the thing and have just judged it by looking at the spec sheet.


It would be a big stretch to assume that the Wii U uses TEV.
And it is only speculation at this point.

“CPU and GPU will always see a consistent view of data in memory. If one processor makes a change then the other processor will see that changed data, even if the old value was being cached” – the Xbox One will feature something similar to hUMA.
Read more at http://gamingbolt.com/xbox-one-to-have-a-similar-solution-to-huma-xbox-one-dev#yR3o4WwxAzyYOAQ9.99
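
As an aside, here's a minimal C sketch of what that coherency guarantee buys you in practice. The cache-maintenance call (my_cache_flush) is a hypothetical stand-in for whatever a non-coherent platform's SDK provides, not a real API:

    #include <stddef.h>

    void my_cache_flush(void *p, size_t n);   /* hypothetical SDK call */

    volatile int shared_flag;
    int shared_data[256];

    /* Without hardware coherency: the CPU must push its writes out of
       its own cache before the GPU can safely read them. */
    void produce_noncoherent(void) {
        shared_data[0] = 42;
        my_cache_flush(shared_data, sizeof shared_data);           /* hypothetical */
        shared_flag = 1;
        my_cache_flush((void *)&shared_flag, sizeof shared_flag);  /* hypothetical */
    }

    /* With a hUMA-style coherent view: the hardware keeps the caches
       in sync, so the explicit maintenance simply goes away. */
    void produce_coherent(void) {
        shared_data[0] = 42;
        shared_flag = 1;   /* the GPU sees the updated data directly */
    }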

There is no way in hell the Wii U uses a TEV unit. That's a DX7-era fixed function register combiner unit.

Devs have said Wii U has DX10/11 equivalent features, which puts it a decade ahead of that.

I didn't say it had a TEV Unit. My theory is that it has an EVOLVED TEV Unit. The TEV Unit in the Wii was the main cause of the Wii not receiving down-ports from the PS3 and 360, not the gap in power. It gave the Wii a nonstandard rendering pipeline.

An evolution of the TEV Unit would explain a lot. It would explain the low power draw, it would explain the ALUs being twice the size they should be, and it would explain how a GPU with just 176 or 352 GFLOPS can produce the visuals we've seen for Super Mario 3D World, Mario Kart 8, Bayonetta 2, X and SSBU. The first 3 are 720p native at 60fps with v-sync enabled, X is believed to be 720p native, probably 30fps with v-sync enabled, and SSBU is believed to be 1080p native at 60fps.
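
For what it's worth, the 176/352 figures come from the standard shader-throughput estimate, assuming the commonly reported ~550 MHz clock for Latte and either 160 or 320 ALUs (neither count is confirmed):

    GFLOPS = ALUs x 2 ops/cycle (multiply-add) x clock in GHz
    160 x 2 x 0.55 = 176 GFLOPS
    320 x 2 x 0.55 = 352 GFLOPS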

And with regards to CPU latency, it should be remembered that Espresso has a ridiculously short 4-stage pipeline compared to the PS4's CPU with 17 stages. That's why Espresso doesn't have SMT; it doesn't need it.



snowdog said:
curl-6 said:

“CPU and GPU will always see a consistent view of data in memory. If one processor makes a change then the other processor will see that changed data, even if the old value was being cached” – the Xbox One will feature something similar to hUMA.
Read more at http://gamingbolt.com/xbox-one-to-have-a-similar-solution-to-huma-xbox-one-dev#yR3o4WwxAzyYOAQ9.99

There is no way in hell the Wii U uses a TEV unit. That's a DX7-era fixed function register combiner unit.

Devs have said Wii U has DX10/11 equivalent features, which puts it a decade ahead of that.

I didn't say it had a TEV Unit. My theory is that it has an EVOLVED TEV Unit. The TEV Unit in the Wii was the main cause of the Wii not receiving down-ports from the PS3 and 360, not the gap in power. It gave the Wii a nonstandard rendering pipeline.

An evolution of the TEV Unit would explain a lot. It would explain the low power draw, it would explain the ALUs being twice the size they should be, and it would explain how a GPU with just 176 or 352 GFLOPS can produce the visuals we've seen for Super Mario 3D World, Mario Kart 8, Bayonetta 2, X and SSBU. The first 3 are 720p native at 60fps with v-sync enabled, X is believed to be 720p native, probably 30fps with v-sync enabled, and SSBU is believed to be 1080p native at 60fps.

When you say an "evolved TEV", do you mean it's still a texture combiner unit (that's pretty much impossible given what we've seen it pull off already), or that it's simply a more modern fixed-function unit, with DX10/11 fixed functions instead of DX7 ones?



snowdog said:

I didn't say it had a TEV Unit. My theory is that it has an EVOLVED TEV Unit. The TEV Unit in the Wii was the main cause of the Wii not receiving down-ports from the PS3 and 360, not the gap in power. It gave the Wii a nonstandard rendering pipeline.

That was only part of the problem. The gap in power and the lack of RAM were also huge obstacles to proper down-ports from the PS3 and 360.

It wasn't just one component that hindered multi-platform ports to the Wii; it was the whole system.



curl-6 said:
snowdog said:
curl-6 said:

“CPU and GPU will always see a consistent view of data in memory. If one processor makes a change then the other processor will see that changed data, even if the old value was being cached” – the Xbox One will feature something similar to hUMA.
Read more at http://gamingbolt.com/xbox-one-to-have-a-similar-solution-to-huma-xbox-one-dev#yR3o4WwxAzyYOAQ9.99

There is no way in hell the Wii U uses a TEV unit. That's a DX7-era fixed function register combiner unit.

Devs have said Wii U has DX10/11 equivalent features, which puts it a decade ahead of that.

I didn't say it had a TEV Unit. My theory is that it has an EVOLVED TEV Unit. The TEV Unit in the Wii was the main cause of the Wii not receiving down-ports from the PS3 and 360, not the gap in power. It gave the Wii a nonstandard rendering pipeline.

An evolution of the TEV Unit would explain a lot. It would explain the low power draw, it would explain the ALUs being twice the size they should be, and it would explain how a GPU with just 176 or 352 GFLOPS can produce the visuals we've seen for Super Mario 3D World, Mario Kart 8, Bayonetta 2, X and SSBU. The first 3 are 720p native at 60fps with v-sync enabled, X is believed to be 720p native, probably 30fps with v-sync enabled, and SSBU is believed to be 1080p native at 60fps.

When you say an "evolved TEV", do you mean it's still a texture combiner unit (that's pretty much impossible given what we've seen it pull off already), or that it's simply a more modern fixed-function unit, with DX10/11 fixed functions instead of DX7 ones?

The latter, though I'm not too sure how they would have done it. We know that what we've seen so far isn't possible on a bog-standard 176 GFLOPS/352 GFLOPS GPU at such a low wattage.

And with regards to the CPU discussion above, code written specifically for in-order CPUs that also handle audio needs major changes to run on an out-of-order CPU and a DSP, unless you want the out-of-order CPU to brute-force some of the code (which explains some of the performance issues with ports).
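
To illustrate the restructuring involved, a rough C sketch; the names (dsp_submit, audio_tick_*) are hypothetical, since the actual console SDKs are under NDA:

    #include <stdint.h>
    #include <stddef.h>

    #define FRAMES 512
    static int16_t mix_buf[FRAMES * 2];   /* interleaved stereo */

    void dsp_submit(const int16_t *buf, size_t frames);   /* hypothetical */

    /* PS3/360 style: the game mixes every voice on a CPU core (or SPU)
       itself, every audio tick. This loop is what gets hand-tuned for
       an in-order core. */
    void audio_tick_cpu(void) {
        for (size_t i = 0; i < FRAMES * 2; i++)
            mix_buf[i] = 0;   /* ...accumulate voices into the buffer... */
    }

    /* DSP style: the CPU mostly just queues buffers, and the mixing
       work above has to be rewritten to target the DSP instead. */
    void audio_tick_dsp(void) {
        dsp_submit(mix_buf, FRAMES);
    }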

And conina, the gap in power wasn't a huge problem. Treyarch proved this with Modern Warfare, a game that was supposedly impossible to run on the Wii. The Wii could have run downscaled ports of the likes of Dead Space, Fallout, and BioShock with less or even no AA, fewer environmental and particle effects, more geometry and texture pop-in, a lower native resolution, and lower quality textures, if Nintendo had used a GPU with traditional programmable shaders.

And the same can be said about the Wii U receiving PS4 and Xbox One ports too.



snowdog said:
curl-6 said:

When you say an "evolved TEV", do you mean it's still a texture combiner unit (that's pretty much impossible given what we've seen it pull off already), or that it's simply a more modern fixed-function unit, with DX10/11 fixed functions instead of DX7 ones?

The latter, though I'm not too sure how they would have done it. We know that what we've seen so far isn't possible on a bog-standard 176 GFLOPS/352 GFLOPS GPU at such a low wattage.

And with regards to the CPU discussion above, code written specifically for in-order CPUs that also handle audio needs major changes to run on an out-of-order CPU and a DSP, unless you want the out-of-order CPU to brute-force some of the code (which explains some of the performance issues with ports).

And conina, the gap in power wasn't a huge problem. Treyarch proved this with Modern Warfare, a game that was supposedly impossible to run on the Wii. The Wii could have run downscaled ports of the likes of Dead Space, Fallout, and BioShock with less or even no AA, fewer environmental and particle effects, more geometry and texture pop-in, a lower native resolution, and lower quality textures, if Nintendo had used a GPU with traditional programmable shaders.

And the same can be said about the Wii U receiving PS4 and Xbox One ports too.


@ bold

Why not? PC graphics cards of similar power were already pushing visuals beyond what the 360/PS3 were doing, more or less in line with the Wii U's graphics. That's discounting the added benefit of optimisations that devs have with a fixed platform.