
Wii U games Resolution and fps talk. We need to clarify something.

FrancisNobleman said:
DarkTemplar said:

Unfortunately we cannot just sum the TV screen resolution and the GamePad resolution, because it is actually more demanding to rasterize two different viewports (in this case 480p + 720p) than a single viewport with the same total number of pixels.

There are a few reasons for that. For instance, while rasterizing two different viewports the CPU and GPU caches will suffer from more misses because of the Principle of Locality.

wow, finally someone who goes straight to the point.

 

Could you please explain to us how much more demanding a game is at 720p@30fps on the TV plus a different 3D scene at 480p@30fps on the GamePad, compared to just 720p@30fps with no second screen, like on any other console?

I am sorry to disappoint you but I am not able to precisely say how much because I do not have enough information about the Wii U hardware or about its SDK. However, I can elaborate more on why it is more demanding...

Memory: you will need a few more megabytes to store both frame buffers (the Wii U's eDRAM helps a lot here). If the game streams data into RAM from a media device (disc or HDD) while it is running, we will need more bandwidth from both the RAM and the device the game is stored on. Finally, to keep the game from slowing down, more RAM may be needed because we will have to keep more objects in memory. For instance, imagine an FPS that loads into RAM the whole floor where a player is. Now imagine that one player is using the TV while the other is on the GamePad, and each of them is on a different floor: suddenly the game has to be able to hold two floors in RAM at the same time.
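
To put rough numbers on the framebuffer part (my own back-of-the-envelope assumptions: 32-bit color, double buffering, no depth or AA buffers, not actual SDK figures), a tiny C++ snippet like this:

    // Back-of-the-envelope framebuffer math; all assumptions are mine, not SDK figures.
    #include <cstdio>

    int main() {
        const int bytesPerPixel = 4;   // assuming RGBA8
        const int buffers       = 2;   // assuming double buffering
        long tv  = 1280L * 720 * bytesPerPixel * buffers;   // ~7.0 MB for the TV
        long pad = 640L  * 480 * bytesPerPixel * buffers;   // ~2.3 MB for the GamePad
        std::printf("TV: %.1f MB, GamePad: %.1f MB, total: %.1f MB\n",
                    tv / 1048576.0, pad / 1048576.0, (tv + pad) / 1048576.0);
        return 0;
    }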

GPU/CPU: I believe that running the rasterization process for two different viewports at the same time may really hurt GPU and CPU performance. I already talked about the "Principle of Locality", but we have to be aware that both the GPU and the CPU will not only make more memory accesses but will also process more data. Let's use our FPS example one more time: now imagine both players are at the same point, but while one is looking north the other is looking south. In order to rasterize both screens, the GPU and CPU will need to rasterize all the "visible" objects on both sides of the floor.
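
Just to make the structure of the work concrete, here is a rough C++ sketch of what a dual-output frame looks like; Scene, Camera and Framebuffer are made-up placeholder types, not anything from the Wii U SDK:

    // Rough sketch of one dual-output frame. All types here are hypothetical.
    #include <cstdio>

    struct Camera      { const char* name; };
    struct Framebuffer { int width, height; };
    struct Scene       { int objectCount; };

    // Each call walks the scene again: culling, draw calls, rasterization.
    void renderView(const Scene& scene, const Camera& cam, const Framebuffer& fb) {
        std::printf("render %d objects for %s into %dx%d\n",
                    scene.objectCount, cam.name, fb.width, fb.height);
    }

    int main() {
        Scene world{1500};
        Camera tvCam{"TV player"}, padCam{"GamePad player"};
        Framebuffer tvFb{1280, 720}, padFb{640, 480};

        // One game frame: simulate once, but rasterize twice.
        // When the two cameras diverge, the second pass touches different
        // objects and textures, which is where the extra cache misses come from.
        renderView(world, tvCam, tvFb);
        renderView(world, padCam, padFb);
        return 0;
    }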



Around the Network
FrancisNobleman said:
JoeTheBro said:

It depends on the game. This isn't just a one-variable thing like you seem to be thinking.

Just go with the example of ZombiU then. If Sony came up with a copy of the GamePad for the PS3, would the console run the exact same game as it is on the Wii U? I bet it would be nearly impossible.

Well, I already know that the Wii U is a lot stronger than current gen, but yes, the PS3 could handle rendering this if it were designed for two-screen output.



DarkTemplar said:

I am sorry to disappoint you but I am not able to precisely say how much because I do not have enough information about the Wii U hardware or about its SDK. However, I can elaborate more on why it is more demanding...

Memory: you will need a few more megabytes to store both frame buffers (the Wii U's eDRAM helps a lot here). If the game streams data into RAM from a media device (disc or HDD) while it is running, we will need more bandwidth from both the RAM and the device the game is stored on. Finally, to keep the game from slowing down, more RAM may be needed because we will have to keep more objects in memory. For instance, imagine an FPS that loads into RAM the whole floor where a player is. Now imagine that one player is using the TV while the other is on the GamePad, and each of them is on a different floor: suddenly the game has to be able to hold two floors in RAM at the same time.

GPU/CPU: I believe that running the rasterization process for two different viewports at the same time may really hurt GPU and CPU performance. I already talked about the "Principle of Locality", but we have to be aware that both the GPU and the CPU will not only make more memory accesses but will also process more data. Let's use our FPS example one more time: now imagine both players are at the same point, but while one is looking north the other is looking south. In order to rasterize both screens, the GPU and CPU will need to rasterize all the "visible" objects on both sides of the floor.

So if both players are looking from the same perspective, it becomes less taxing because the same polygons, animations and textures are being processed just once?



FrancisNobleman said:

1080p, 720p, 60 frames per second, 30 frames... OK, I get that measuring the pixels and refresh rate of your TV is important, but you know...

 

Is the GamePad's 480p screen processing running on unicorn tears?

 

Am I the only one who thinks that, if a game on Wii U runs at 720p@30fps on the TV plus 480p@30fps on the GamePad (when pushing a 3D scene, like ZombiU), it should be considered 1200p@60fps, adding all the processing together, for common sense's sake?

Of course, if the GamePad is displaying just 2D stuff, only the TV should be considered.

No, someone who thought that would suck at math. 480p is 307,200 pixels and 720p is 921,600 (720p = 1280x720); resolution labels are just shorthand for the true pixel count. Add those two together and you get 1,228,800. That may sound like a lot, until you look at 1200p, which is 1920x1200, or 2,304,000 pixels, almost double. What you're suggesting would be closer to 768p. Not to mention that the 480p screen is usually running stuff like minimaps. How about, before you make a stupid argument, you actually know what you're talking about.
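
For anyone who wants to check the arithmetic, here it is spelled out in a few lines of C++ (same numbers as above, assuming 480p means 640x480):

    // The pixel counts from the post above, spelled out.
    #include <cstdio>

    int main() {
        long p480  = 640L  * 480;    //   307,200
        long p720  = 1280L * 720;    //   921,600
        long p1200 = 1920L * 1200;   // 2,304,000
        long sum   = p480 + p720;    // 1,228,800

        std::printf("480p + 720p = %ld pixels\n", sum);
        std::printf("1200p       = %ld pixels (%.2fx the combined output)\n",
                    p1200, (double)p1200 / sum);
        return 0;
    }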



I doubt the WiiU's controller screen is using much power at all. It's a 2D image that is often fairly static.



Around the Network
Zekkyou said:
I doubt the WiiU's controller screen is using much power at all. It's a 2D image that is often fairly static.

Well that depends on the game, really. There are games that use it for simple 2D maps, but there are also games that use it for more complex stuff, different viewpoints in 3D and whatnot. That's what's stated in the OP, too.




 

FrancisNobleman said:
DarkTemplar said:

I am sorry to disappoint you but I am not able to precisely say how much because I do not have enough information about the Wii U hardware or about its SDK. However, I can elaborate more on why it is more demanding...

Memory: you will need a few more megabytes to store both frame buffers (the Wii U's eDRAM helps a lot here). If the game streams data into RAM from a media device (disc or HDD) while it is running, we will need more bandwidth from both the RAM and the device the game is stored on. Finally, to keep the game from slowing down, more RAM may be needed because we will have to keep more objects in memory. For instance, imagine an FPS that loads into RAM the whole floor where a player is. Now imagine that one player is using the TV while the other is on the GamePad, and each of them is on a different floor: suddenly the game has to be able to hold two floors in RAM at the same time.

GPU/CPU: I believe that running the rasterization process for two different viewports at the same time may really hurt GPU and CPU performance. I already talked about the "Principle of Locality", but we have to be aware that both the GPU and the CPU will not only make more memory accesses but will also process more data. Let's use our FPS example one more time: now imagine both players are at the same point, but while one is looking north the other is looking south. In order to rasterize both screens, the GPU and CPU will need to rasterize all the "visible" objects on both sides of the floor.

So if both players are looking from the same perspective, it becomes less taxing because the same polygons, animations and textures are being processed just once?

Somewhat. When rendering in 3D a lot of things can be optimized, but only because the devs know the cameras always stay in exactly the same position relative to each other. If you're playing multiplayer and both people just happen to look in the same direction, the game is still going to do the calculations redundantly.



FrancisNobleman said:
DarkTemplar said:

I am sorry to disappoint you but I am not able to precisely say how much because I do not have enough information about the Wii U hardware or about its SDK. However, I can elaborate more on why it is more demanding...

Memory: you will need a few more megabytes to store both frame buffers (the Wii U's eDRAM helps a lot here). If the game streams data into RAM from a media device (disc or HDD) while it is running, we will need more bandwidth from both the RAM and the device the game is stored on. Finally, to keep the game from slowing down, more RAM may be needed because we will have to keep more objects in memory. For instance, imagine an FPS that loads into RAM the whole floor where a player is. Now imagine that one player is using the TV while the other is on the GamePad, and each of them is on a different floor: suddenly the game has to be able to hold two floors in RAM at the same time.

GPU/CPU: I believe that running the rasterization process for two different viewports at the same time may really hurt GPU and CPU performance. I already talked about the "Principle of Locality", but we have to be aware that both the GPU and the CPU will not only make more memory accesses but will also process more data. Let's use our FPS example one more time: now imagine both players are at the same point, but while one is looking north the other is looking south. In order to rasterize both screens, the GPU and CPU will need to rasterize all the "visible" objects on both sides of the floor.

So if both players are looking from the same perspective, it becomes less taxing because the same polygons, animations and textures are being processed just once?

This is what we usually call a Corner Case.

When the viewports are similar it becomes more like running the game at twice the framerate. Notice that everything is still being processed twice.
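
A toy cost model makes the point clearer; all the millisecond numbers below are invented placeholders, the only thing that matters is that simulation happens once per frame while rasterization happens once per screen:

    // Toy per-frame cost model; the numbers are made up for illustration.
    #include <cstdio>

    int main() {
        const double simulateMs     = 8.0;  // game logic/animation, done once per frame
        const double rasterizeTvMs  = 14.0; // 720p pass
        const double rasterizePadMs = 6.0;  // 480p pass

        double singleScreen = simulateMs + rasterizeTvMs;                   // 22 ms
        double dualScreen   = simulateMs + rasterizeTvMs + rasterizePadMs;  // 28 ms

        std::printf("single screen: %.0f ms/frame, dual screen: %.0f ms/frame\n",
                    singleScreen, dualScreen);
        // Even when both views show the same image, the second pass still runs
        // unless the engine explicitly copies or downscales the TV framebuffer.
        return 0;
    }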



DarkTemplar said:
This is what we usually call a Corner Case.

When the viewports are similar it becomes more like running the game at twice the framerate. Notice that everything is still being processed twice.

 

I think that dual rendering (talking about 3D output, not when the GamePad is only outputting a 2D menu or something like that) adds a dangerous factor: fluctuating performance. Let's say both screens are showing a viewport of the scene. If both outputs cover a similar region, the textures and 3D models will be the same and will stay cached. Now, if they are showing different parts of the scene, we will have more textures and more 3D models in flight, so performance will likely decrease. My point is that changing the viewport on one of the outputs can make performance drop without any change to the scene itself, and that is harder to optimize.

Anyway, the slow part is rasterization, and it will be done twice unless the image is the same on both. But the situation above made me think about how caching is handled here. Basically, if the models, textures, etc. differ between the two outputs to some degree, wouldn't that hurt caching performance, since the GPU would need more cache to hold all that data?
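
A quick toy illustration of that working-set concern (the asset IDs are arbitrary; it just counts how many unique assets the two views touch together):

    // When the two views diverge, the union of assets they reference grows,
    // so less of it fits in cache. Asset IDs here are arbitrary placeholders.
    #include <cstdio>
    #include <set>

    int main() {
        std::set<int> tvView  = {1, 2, 3, 4};   // assets visible on the TV
        std::set<int> padSame = {1, 2, 3, 4};   // GamePad looking the same way
        std::set<int> padDiff = {5, 6, 7, 8};   // GamePad looking elsewhere

        auto workingSet = [](std::set<int> a, const std::set<int>& b) {
            a.insert(b.begin(), b.end());
            return a.size();
        };

        std::printf("same direction:      %zu unique assets\n", workingSet(tvView, padSame)); // 4
        std::printf("different direction: %zu unique assets\n", workingSet(tvView, padDiff)); // 8
        return 0;
    }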



You're all making the Wii U sound like a super powerful machine that overkills the PS3 & 360 in specs, LOL.