
Forums - Nintendo Discussion - Wii U's eDRAM stronger than given credit?

You can't say what Latte is capable of in the PC world because we don't know enough about it. It's a completely custom GPU whose die shot bears absolutely no resemblance to any other GPU in existence.

We do know that the Wii U is built from the ground up to be efficient. Given the points I've already made, there's absolutely no reason why the Wii U wouldn't be able to run down-ports from the PS4 and One.




I agree with you... Judging by the amount of SRAM registers visible in the die shots, it looks like it has 320 SPUs... so it matches the Radeon HD 5550 and 3870 in SPU count, though the number of SPUs per cluster is different.

So I am doubtful that it has more...

I only said the minimum of what it's capable of. RE: Revelations would have run on the Wii U at 1080p/30fps with some effort spent adapting it to the architecture, and if it were using every bit of the potential the hardware has, it would have run at 1080p/60fps with improved details.



snowdog said:
You can't say what Latte is capable of in the PC world because we don't know enough about it. It's a completely custom GPU whose die shot bears absolutely no resemblance to any other GPU in existence.

We do know that the Wii U is built from the ground up to be efficient. Given the points I've already made, there's absolutely no reason why the Wii U wouldn't be able to run down-ports from the PS4 and One.


That's not true; everyone from Digital Foundry to Beyond3D to techies on GAF has been able to narrow the U's GPU down to a custom 4650/5550 equivalent. Stop trying to imply that it's above or beyond these two cards. As a former owner of both a 5670 on PC and a Wii U, that benchmark feels pretty darn accurate given what that card could do at 900p and everything we've seen so far (X and Bayo 2 included).

The PS4 and X1 are custom, but they perform identically to their PC equivalents. Tomb Raider averages 50fps on the PS4; it averages 52fps on a 7870 on Ultra.

It doesn't make sense for Nintendo and other devs to handicap the U's visuals when they have its full specs and documentation. Come on.



I predict that the Wii U will sell a total of 18 million units in its lifetime. 

The NX will be a 900p machine

snowdog said:
We do know that the Wii U is built from the ground up to be efficient. There's absolutely no reason why the Wii U wouldn't be able to run down-ports from the PS4 and One though due to the points I've already made.

Developers have access to 5 GB of RAM on XBO and PS4; on Wii U only 1 GB is available for games. With optimized OS footprints they can probably get to 6 GB of available RAM on XBO and PS4, and 1.5 GB on Wii U.

Even if the PowerPC CPU is less RAM-hungry than x86 systems... will 1/4 to 1/5 of the RAM suffice for games that take full advantage of the available RAM on XBO/PS4, without too many compromises? Is it even worth it? Most Nintendo fans will accuse the devs of making "shoddy ports" anyway and won't buy these games.

Multiplatform games will die on Wii U when 360 and PS3 support eventually ends.



Conina... Stop with the bias...

Nintendo fans have the right to call an Xbox 360/PlayStation 3 port "shoddy" when it looks and/or performs worse, because it is shoddy when it hasn't been properly optimized for the architecture. Need for Speed: Most Wanted U is a superior port, and so is Trine 2: Director's Cut, because the developers put some effort into them...

Third parties don't do well for the most part because they barely support their games on the platform, they don't use the Wii U's hardware to improve visual fidelity and/or the gameplay experience via the GamePad, and they pick poorly timed release dates.

Anyway, we know that on XO and PS4 the maximum available RAM is 5 GB, and the OSs use 3 GB for a smoother experience with the UI and system features (unless you want slower response times). I wouldn't be surprised if 1 GB of that is reserved for streaming, to temporarily store recorded footage as it gets compressed. Doesn't it capture something like the last 5-15 minutes of gameplay footage?




We can only judge the Wii U's hardware in comparison to the PS4 and One after Project CARS is released. Unfortunately it's probably going to be the only third party title where the Wii U is going to be treated on an even playing field in terms of time, care and attention.



snowdog said:
We can only judge the Wii U's hardware in comparison to the PS4 and One after Project CARS is released. Unfortunately it's probably going to be the only third party title where the Wii U is going to be treated on an even playing field in terms of time, care and attention.

We can only hope. What we've heard from them so far certainly sounds encouraging, but until I see actual footage and screens, I'm trying not to get my hopes up; I've been burned too many times.



snowdog said:
Games such as Trine 2, Most Wanted, The Wonderful 101, Super Mario 3D World, Mario Kart 8, X, Bayonetta 2 and SSBU say you're wrong.

There have been plenty of games running at native 720p and 60fps with v-sync enabled, the latter of which is the most important.

It lies somewhere between the PS3 and Xbox One in terms of power, and if Nintendo have evolved the TEV unit the way I think they have, it explains several things: 1) Latte's ALUs being twice the size they should be, 2) the low power draw, 3) the performance of Latte in games running at native 720p and 60fps with v-sync enabled, and 4) the negative comments from devs who haven't actually used the thing and have just judged it by looking at the spec sheet.


It would be a big stretch to assume that the Wii U uses TEV; that's only speculation at this point.

Conina said:

Trine 2 on Wii U is 30 fps, not 60fps: http://www.eurogamer.net/articles/digitalfoundry-trine-2-face-off

The PS4 version runs in 1080p 60 fps (2D) or 720p 60 fps (stereoscopic 3D): http://www.eurogamer.net/articles/digitalfoundry-vs-trine-2-on-ps4

That's already quite a steep performance gap:

  • Wii U: 1280 x 720 x 30 = 27.6 MPixel per second
  • PS4 2D: 1920 x 1080 x 60 = 124 MPixel per second (4.5x the Wii U)
  • PS4 3D: 1280 x 720 x 60 x 2 = 110 MPixel per second (4x the Wii U)

But it gets even better:

"We have to go by the 3D limitations at the moment - so the regular resolution is 1080p and 60fps, but in 3D the resolution gets dropped to 720p60. The game actually still runs at 1080p internally. So in the future the game may even automatically be able to run and output at 1080p60."

  • PS4 3D (native rendering): 1920 x 1080 x 60 x 2 = 249 MPixel per second (9x the Wii U)

"One benefit of the current 720p situation is that you get improved anti-aliasing in the form of downsampling the 1080p image to 720p which, combined with FXAA, produces a very clean image that is upscaled very well by our display."

If Sony supports 4K gaming with a firmware update, Trine 2 could be played in 4K at 30 fps... the resource cost is the same as for the current 3D rendering:

  • PS4 2D 4K 30 fps: 3840 x 2160 x 30 = 249 MPixel per second (9x the Wii U)

"I can't think why we technically couldn't support 3840x2160 mode at 30fps (with the stereo rendering quality). Increasing the resolution while rendering less often would end up to the same amount of pixels being rendered"

There is more to a game than just resolution and framerate; plenty of other things can eat into performance while improving the graphics.
However, outside of super high-end PC gaming, 4K isn't that important or even commonplace.

With that in mind, if the Xbox 360 and PlayStation 3 had also had an interconnect that could handle 4K output, then they too could have *potentially* run games at 4K if those games were simple enough.

Running at 4K resolutions is stupidly hardware-intensive. The sad fact is, you won't be getting a game like Crysis running at 4K on any of these consoles; they don't have the RAM, bandwidth or compute resources to achieve it. Only the PC does.

hated_individual said:
Naughty Dog discussed the PlayStation 4's architecture and talked about latency...

The PlayStation 4 has GDDR5 as its main and only RAM. They said it would take 200 cycles to get from the CPU's L2 cache to RAM and vice versa, and the same for GPU to RAM, so it would take 800 cycles to make a full lap from CPU to GPU and back with HSA/hUMA, while the Xbox One, having no HSA/hUMA, would take 1200 cycles if it used the same type of RAM.

The Wii U, on the other hand, would do it in 600 cycles or less if it used the same main RAM as the PS4, since the CPU and GPU have direct access to each other thanks to the eDRAM in the Wii U's GPU; it can send processed data straight to the CPU if needed. That is one of the significant edges the Wii U's architecture has over the competition, and that edge could be considered a superior form of HSA, something AMD might use in the future if viable.

Latency isn't really a problem for GPUs; they're not impacted by it all that much, as it's bandwidth they care about. CPUs, however, are the reverse, depending on the architecture of course.
Intel and AMD spend tens of millions of transistors (maybe even hundreds of millions) on their CPUs to hide latency, hence why it's not such a big deal for the PlayStation 4 and Xbox One.

The Wii U doesn't have that luxury as it uses PowerPC, and I'm actually not entirely sure what IBM has done to hide latencies.

As for hUMA, the Xbox One doesn't actually support it the way the PlayStation 4 does, to my knowledge; however, it does have a comparable implementation.
Keep in mind that hUMA essentially allows the CPU and GPU to view, alter and share the same data sets.

hUMA/HSA is essentially a step towards something greater, which is going to require that technology at its foundation to even work.

The Wii U also has an implementation comparable to hUMA via its eDRAM, where both the GPU and CPU have coherency. I'm not entirely sure whether they share the same privileges in system memory, however, so I won't bother speculating about that.

In the end, latency and bandwidth are only two problems amongst many when it comes to processor and graphics performance; they shouldn't be construed as the single most important factors, which seems to be a stupidly common mistake.
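To put a rough number on "hiding latency": by Little's law, the data a processor must keep in flight to sustain a given bandwidth is bandwidth x latency. A minimal sketch, assuming the 200-cycle round trip quoted above and an 800 MHz clock (the 176 GB/s GDDR5 bandwidth is the PS4's published figure; the rest is illustrative, not a confirmed spec):

```python
# Little's law: bytes in flight = bandwidth * latency.
# A GPU hides a long memory latency by keeping thousands of threads in
# flight; a CPU has to hide it with out-of-order logic and big caches.

latency_cycles = 200      # round trip to GDDR5, per the post above (assumed)
clock_hz = 800e6          # assumed 800 MHz clock
bandwidth = 176e9         # PS4's published 176 GB/s GDDR5 bandwidth

latency_s = latency_cycles / clock_hz
in_flight = bandwidth * latency_s
print(f"Bytes in flight to saturate the bus: {in_flight / 1024:.0f} KiB")
# ~43 KiB of outstanding requests: trivial for a GPU juggling thousands
# of threads, hard for a CPU that must extract it from one or two.
```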

"CPU and GPU will always see a consistent view of data in memory. If one processor makes a change then the other processor will see that changed data, even if the old value was being cached" – the Xbox One will feature something similar to hUMA.
Read more at http://gamingbolt.com/xbox-one-to-have-a-similar-solution-to-huma-xbox-one-dev#yR3o4WwxAzyYOAQ9.99


--::{PC Gaming Master Race}::--

Pemalite said:
snowdog said:
Games such as Trine 2, Most Wanted, The Wonderful 101, Super Mario 3D World, Mario Kart 8, X, Bayonetta 2 and SSBU say you're wrong.

There have been plenty of games running at native 720p and 60fps with v-sync enabled, the latter of which is the most important.

It lies somewhere between the PS3 and Xbox One in terms of power, and if Nintendo have evolved the TEV unit the way I think they have, it explains several things: 1) Latte's ALUs being twice the size they should be, 2) the low power draw, 3) the performance of Latte in games running at native 720p and 60fps with v-sync enabled, and 4) the negative comments from devs who haven't actually used the thing and have just judged it by looking at the spec sheet.


It would be a big stretch to assume that the Wii U uses TEV; that's only speculation at this point.


There is no way in hell the Wii U uses a TEV unit. That's a DX7-era fixed function register combiner unit.

Devs have said Wii U has DX10/11 equivalent features, which puts it a decade ahead of that.



The 360's eDRAM was theoretically fast enough to never be a bottleneck in the pipeline, so even if it were 2 or 4 times faster it would not change its performance as part of the system.
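A back-of-the-envelope sketch of why, using the commonly cited Xenos figures (8 ROPs at 500 MHz, 4x MSAA, 32-bit colour plus 32-bit Z, read-modify-write); treat it as a sanity check rather than a spec sheet:

```python
# Worst-case ROP bandwidth demand on the Xbox 360's eDRAM daughter die.
rops = 8                  # pixels per clock
clock_hz = 500e6          # eDRAM/ROP clock
samples_per_pixel = 4     # 4x MSAA
bytes_per_sample = 4 + 4  # 32-bit colour + 32-bit Z
rw_factor = 2             # read-modify-write for blending/Z-testing

demand = rops * clock_hz * samples_per_pixel * bytes_per_sample * rw_factor
print(f"Peak ROP demand: {demand / 1e9:.0f} GB/s")  # 256 GB/s

# That matches the eDRAM's quoted 256 GB/s internal bandwidth exactly:
# the ROPs can never outrun it, so making it faster wouldn't help.
```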

It's probably the same for the Wii U: the bottleneck on the GPU is likely in the shaders. The Wii U also seems bottlenecked by its CPU and its SIMD throughput.

Pointing at singular bits of the hardware is kind of pointless. For example, if I told you the PS4 and X1's CUs each have an internal bandwidth of 256 bytes/cycle, for a total of 3433 GB/s on the PS4, would that blow your mind?
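That total is straightforward arithmetic, assuming the PS4's 18 CUs at 800 MHz (published specs) and taking the 256 bytes/cycle per CU at face value; the quoted figure reads as binary gigabytes:

```python
# Reproducing the aggregate CU bandwidth figure quoted above.
bytes_per_cycle_per_cu = 256  # per-CU figure from the post, taken at face value
num_cus = 18                  # PS4's published CU count
clock_hz = 800e6              # PS4's published GPU clock

total = bytes_per_cycle_per_cu * num_cus * clock_hz
print(f"{total / 1e9:.0f} GB/s")     # ~3686 GB/s in decimal gigabytes
print(f"{total / 2**30:.0f} GiB/s")  # ~3433 -- the quoted figure, in binary GB
```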

So, PS4/X1 CUs are stronger than given credit? Who cares. Nobody really cares what eDRAM the Wii U has, only that the games it runs still look like Gen 7 games.



My 8th gen collection