
Wii U's eDRAM stronger than given credit?

As I've already mentioned, the raw floppage doesn't tell the whole story, and even if it did, opinion is divided between 176 GFLOPS and 352 GFLOPS.

The game titles I've mentioned are a clear step ahead of anything on the PS3 and 360. If you don't agree with that then I seriously suggest you take a trip down to Specsavers.



snowdog said:
As I've already mentioned, the raw floppage doesn't tell the whole story, and even if it did, opinion is divided between 176 GFLOPS and 352 GFLOPS.

The game titles I've mentioned are a clear step ahead of anything on the PS3 and 360. If you don't agree with that then I seriously suggest you take a trip down to Specsavers.

It's not divided anymore; everybody on NeoGAF and Beyond3D agrees it's 176 GFLOPS, and only a few die-hard fanboys still bring up 352 GFLOPS. As for those games you mentioned, please don't make me laugh. The Last of Us, Forza Horizon, Halo 4, Uncharted 2 and 3, Red Dead Redemption and GTA V are in a different league graphically; I can post direct-feed pics, and those games look so much better. If you think the games you mentioned are impressive, it makes me think you probably only owned a Wii last gen.



I've already explained why PS3 and 360 ports have had performance issues. Here it is again:

The console architecture is different (not to be confused with the CPU architecture), as it is for all three consoles this generation.

The PS3 and 360 are 'CPU heavy', meaning that most floating point work is done by the CPU. The Wii U, PS4 and One are 'GPU heavy'.

In addition to floating-point work, the Cell and Xenon also have to keep at least one SPE and one thread, respectively, entirely dedicated to audio. The Wii U, PS4 and One all have audio DSPs.

Cell and Xenon are in-order execution CPUs, whereas the CPUs in the Wii U, PS4 and One are all out-of-order.

All of the above means that for ports that haven't been well optimised, Espresso ends up having to brute-force a lot of code.

Developers are going to find it a great deal easier porting between the Wii U, PS4 and One than it currently is to port between the Wii U, PS3 and 360.

And if that wasn't enough you've also got to remember that the vast majority of ports have been done by B teams.

We're only going to see what it's capable of on a level playing field when Project CARS is released. And incidentally, the PS3 and 360 SKUs were canned because neither console was powerful enough to do the game justice.



Aaaaaaand Ninjablade's 5th alt is permabanned in 10 posts. Jeez, you'd think Iwata ran over his dog or something.

There's just so much emotion on both sides of this debate that it seems we'll never be able to have a discussion about Wii U's graphical capabilities without a shitstorm erupting eventually.

All I'll say about the eDRAM is this: Shin'en have said it was critical to their new engine, which we'll be seeing in action soon with Art of Balance and FAST Racing Neo. Looks rather promising, though.



snowdog said:
Games such as Trine 2, Most Wanted, The Wonderful 101, Super Mario 3D World, Mario Kart 8, X, Bayonetta 2 and SSBU say you're wrong.

There have been plenty of games at native 720p and 60fps with v-sync enabled, the latter of which is the most important.

Trine 2 on Wii U is 30 fps, not 60fps: http://www.eurogamer.net/articles/digitalfoundry-trine-2-face-off

The PS4 version runs in 1080p 60 fps (2D) or 720p 60 fps (stereoscopic 3D): http://www.eurogamer.net/articles/digitalfoundry-vs-trine-2-on-ps4

That's already quite a steep performance gap:

  • Wii U: 1280 x 720 x 30 = 27.6 MPixel per second
  • PS4 2D: 1920 x 1080 x 60 = 124 MPixel per second (4.5x of the Wii U)
  • PS4 3D: 1280 x 720 x 60 x 2 = 110 MPixel per second (4x of the Wii U)

But it gets even better:

"We have to go by the 3D limitations at the moment - so the regular resolution is 1080p and 60fps, but in 3D the resolution gets dropped to 720p60. The game actually still runs at 1080p internally. So in the future the game may even automatically be able to run and output at 1080p60."

  • PS4 3D (native rendering): 1920 x 1080 x 60 x 2 = 249 MPixel per second (9x of the Wii U)

"One benefit of the current 720p situation is that you get improved anti-aliasing in the form of downsampling the 1080p image to 720p which, combined with FXAA, produces a very clean image that is upscaled very well by our display."

If Sony were to support 4K gaming with a firmware update, Trine 2 could be played at 4K 30 fps... the resource cost is the same as for the current 3D rendering:

  • PS4 2D 4K 30 fps: 3840 x 2160 x 30 = 249 MPixel per second (9x of the Wii U)

"I can't think why we technically couldn't support 3840x2160 mode at 30fps (with the stereo rendering quality). Increasing the resolution while rendering less often would end up to the same amount of pixels being rendered"




I didn't say Trine 2 ran at 60fps; 'plenty' does not mean every game I named, otherwise I would have said 'all of the above'. I mentioned Trine 2 because the developers specifically stated that the Wii U SKU wouldn't run on the PS3 or 360 without being downgraded.

And why are you mentioning the PS4? We were discussing the difference between the Wii U and the last generation consoles, not the PS4 and One.



Conina said:
snowdog said:
Games such as Trine 2, Most Wanted, The Wonderful 101, Super Mario 3D World, Mario Kart 8, X, Bayonetta 2 and SSBU say you're wrong.

There have been plenty of games at native 720p and 60fps with v-sync enabled, the latter of which is the most important.

Trine 2 on Wii U is 30 fps, not 60fps: http://www.eurogamer.net/articles/digitalfoundry-trine-2-face-off

The PS4 version runs in 1080p 60 fps (2D) or 720p 60 fps (stereoscopic 3D): http://www.eurogamer.net/articles/digitalfoundry-vs-trine-2-on-ps4

... [many numbers]


You really take all those numbers out of one game?

4K graphics on PS4? The console already struggles with 1080p/30FPS. Even Killzone on PS4 doesn't run at 1080p. Besides, the PS4 architecture is very "easy" to develop for (as intended by Sony), so developers are able to use the full power of the PS4 from the beginning. There will be no big learning curve like the one PS3 developers had as they learned to use the Cell CPU. PS4 games two years from now will not look much better than PS4 games today.

The Wii U obviously has less power than the PS4, but its architecture is much more complicated, so future Wii U games will show the developers' learning curve. The Wii U has no hidden magic, but it does provide some special hardware tweaks that will yield much more performance for developers willing to invest time in understanding the hardware. The Wii U will never show PS4-quality graphics; it will get nearer to that over the years but never reach it. And the PS4 will never show high-end PC graphics, because hardware-wise it is "only" a mid-range gaming PC.



snowdog said:
And why are you mentioning the PS4? We were discussing the difference between the Wii U and the last generation consoles, not the PS4 and One.

Who is "we"? Many in this thread are also discussing the difference between Wii U and XBO or PS4, some claim that the Wii U performance is closer to XBO than to 360/PS3.

Even in the OT a comparison to the XBO is included: "For comparison, the Xbox One runs at about 170GB per second of bandwidth between DDR3 and eSRAM."




z101 said:

You really take all those numbers out of one game.

Well, Trine 2 was used many times in this thread to show the power of the Wii U; I just wanted to give a bit of context on how far ahead the PS4 version is.

z101 said:

4K graphics on PS4? The console already struggles with 1080p/30FPS.

Naturally, 4K gaming will be impossible for most PS4 games.

But less demanding games like Trine 2 or Rayman Legends, and many indie games, should run totally fine in 4K 2D if Sony ever supports that resolution.

As said, in stereoscopic mode Trine 2 internally renders two video streams at 1920x1080 with a rock-solid 60 fps:

One video stream at 4K and 30 fps needs exactly the same resources (RAM, processing power), so Trine 2 should deliver a rock-solid 30 fps in 4K 2D. Rayman Legends is even less demanding than Trine 2; perhaps it could run in 4K 2D at 60 fps.
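For what it's worth, the "same resources" point is just the earlier pixel arithmetic rearranged; a one-line check under that assumption:

```python
# Stereoscopic 1080p at 60 fps vs. a single 4K stream at 30 fps.
stereo_1080p60 = 1920 * 1080 * 60 * 2   # two eyes per frame
mono_4k30      = 3840 * 2160 * 30       # one 2160p frame at half the framerate

assert stereo_1080p60 == mono_4k30      # both are 248,832,000 pixels per second
```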



Rooki (Ninjablade) has seriously flawed arguments, and we can see from his posts that he has confirmation bias. His comparison of games is also unfair, since he used super-sampled bullshots; the GTA V shot is not in-game, and the in-game visuals of GTA V on Xbox 360 and PlayStation 3 aren't close to those bullshots, while Dark Souls 2 looks nothing like the preview, which is now controversial.

Anyone claiming that the Wii U is weaker than the Xbox 360 and PlayStation 3 is in denial, and yet those of us who disagree with them get accused of the very behaviour they display...

The Wii U's GPU, in PC terms, is capable of playing Far Cry 2 at 720p/30fps on Ultra settings, Resident Evil 5 at 1080p/30fps on the Very High preset, and Dirt 2 at 1080p/25fps on Ultra settings.

If those games were on the Wii U and optimized for its architecture, they would yield 50-100+% more FPS because of vastly reduced overhead, and the Wii U has an edge over the PlayStation 4's architecture when it comes to latency and bandwidth, which increases efficiency and performance.

Naughty Dog discussed the PlayStation 4's architecture and talked about latency...

The PlayStation 4 has GDDR5 as its main and only RAM. They said it would take 200 cycles to go from the CPU's L2 cache to RAM and vice versa, and the same for GPU to RAM, so it would take 800 cycles to make a full lap from CPU to GPU and back with HSA/hUMA, while the Xbox One has no HSA/hUMA and would need 1200 cycles if it were using the same type of RAM.

The Wii U, on the other hand, would do it in 600 cycles or less if it used the same main RAM as the PS4, since the CPU and GPU have direct access to each other thanks to the eDRAM on the Wii U's GPU, which can send processed data directly to the CPU if needed. That is one of the significant edges the Wii U's architecture has over the competition; that edge can be considered a superior form of HSA and something that AMD might use in the future if viable.
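Taking the per-hop figures at face value (these are the claimed numbers from the post above, not vendor specs or measurements), the lap arithmetic works out like this in a quick sketch:

```python
# Round-trip "lap" estimate: CPU L2 -> RAM -> GPU -> RAM -> CPU L2,
# using the per-hop cycle counts claimed above (not measured data).

def round_trip(hop_cycles, hops=4):
    """Total cycles for a full CPU -> GPU -> CPU lap through shared memory."""
    return hop_cycles * hops

claims = {
    "PS4 (hUMA, GDDR5)":           round_trip(200),  # 4 x 200 = 800 cycles
    "Xbox One (no hUMA, claimed)": 1200,              # figure asserted in the post
    "Wii U (eDRAM shortcut)":      600,               # figure asserted in the post
}

for platform, cycles in claims.items():
    print(f"{platform:30s} ~{cycles} cycles per lap")
```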