
If Wii U could do PS4 graphics at 720p, would that be enough for you?


 

Poll: If Wii U could do PS4 graphics at 720p, would that be enough for you?

Yes, that'd be satisfactory: 130 votes (31.18%)
Still wouldn't be good enough: 84 votes (20.14%)
Wii U's graphics are already fine: 203 votes (48.68%)

Total: 417 votes
Zekkyou said:
JNK said:

Hm, not really.

I guess you underestimate the Wii U or overestimate the PS4.

Here's a small comparison for you:

[video]

Keep in mind Watch Dogs runs at 900p/30fps on PS4 and 1152x648/25fps on Wii U.

Not too much difference, huh?

The PS4 would have much more trouble running 720p/30fps Wii U games at 1080p/60fps than the other way around.

If you insist on using YouTube for comparisons like this (which is fundamentally flawed in the first place; the compression will always favor the lower-res version), at least use ones by someone like DF. Their videos are a lot better than the one you linked ^^ - https://www.youtube.com/watch?v=xkxnVwrZFVI

Even with the compression I can see the IQ difference between them here quite clearly. The jaggies at 0:47... *shudders*

Yeah, there are some differences, but this game is far from 1080p/60fps on PS4. How many 1080p/60fps games are there on PS4? They all look only slightly better than 720p/30fps games from last gen. And the Wii U is slightly more powerful. It would definitely work, but since the Wii U is selling badly we will never see it.

 

I doubt the PS4 could handle Xenoblade X at 1080p/60fps without any problems.
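A rough way to sanity-check the resolution and frame-rate figures traded above is raw pixel throughput (width x height x fps). A minimal Python sketch using the numbers quoted in this exchange; it ignores per-pixel shading cost entirely, so treat it as a ballpark only:

```python
# Rough pixel-throughput comparison (pixels per second = width * height * fps).
# Uses the figures quoted in the posts above; ignores per-pixel shading cost,
# so it is only a ballpark, not a real performance measure.

def pixel_rate(width, height, fps):
    """Pixels pushed per second at a given resolution and frame rate."""
    return width * height * fps

ps4_watch_dogs  = pixel_rate(1600, 900, 30)   # 900p / 30fps
wiiu_watch_dogs = pixel_rate(1152, 648, 25)   # 1152x648 / 25fps

full_1080p60 = pixel_rate(1920, 1080, 60)
base_720p30  = pixel_rate(1280, 720, 30)

print(f"Watch Dogs, PS4 vs Wii U:  {ps4_watch_dogs / wiiu_watch_dogs:.2f}x")  # ~2.31x
print(f"1080p/60fps vs 720p/30fps: {full_1080p60 / base_720p30:.2f}x")        # 4.50x
```

So going from 720p/30fps to 1080p/60fps means pushing 4.5x the pixels per second before any increase in per-pixel work, which is the gap the rest of the thread argues about.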




Better-powered hardware would attract 3rd party games. Better-powered hardware would attract a bigger user base. Better-powered hardware would have better games.

So to answer your question, OP: no, 720p wouldn't be enough, but yes, having the best hardware and games would overthrow the PS4 and Xbox One.



JNK said:

Yeah, there are some differences, but this game is far from 1080p/60fps on PS4. How many 1080p/60fps games are there on PS4? They all look only slightly better than 720p/30fps games from last gen. And the Wii U is slightly more powerful. It would definitely work, but since the Wii U is selling badly we will never see it.

I doubt the PS4 could handle Xenoblade X at 1080p/60fps without any problems.

1080p/60fps is somewhat irrelevant to me in most contexts, so I'm not really bothered :p I'd usually rather have a developer target 1080p/30fps and use the additional resources for other things (I would expect WD to be at least 1080p/30fps were it a first-party title, but alas).

As for XCX, you'd have to ask Tachi whether 1080p/60fps would be possible on the PS4.



Danman27 said:
curl-6 said:
Danman27 said:

The lowest-res game on the PS4 is 900p.

Magnetic: Cage Closed runs at 720p on PS4.

http://gamingbolt.com/magnetic-cage-closed-to-run-at-720p60fps-on-ps4xbox-one-devs-not-fans-of-cinematic-experience

And based on the screenshot, it's very clear that it isn't because the PS4 isn't capable of doing so; it's because it's an indie dev that isn't good at optimising for either console.

Even so, 900p is NOT the lowest res on a PS4 game.



curl-6 said:
bonzobanana said:

One of the key points about the Wii U spec that cannot be questioned is the 12.8GB/s memory bandwidth, because the memory chips are clearly labelled and branded. So we know that whatever the Wii U does, it does it while transferring a maximum of 12.8GB/s. That figure is shared between the operating system side and the game side, and the only additional bandwidth is the 32MB of fast memory built into the GPU. 12.8GB/s is tiny compared to the 176GB/s of a PS4, for example. Memory bandwidth is a good rough guide to a console's performance, because it would have been selected on the basis of the requirements of both the GPU and CPU of the system and how much data they can move. Clearly not a lot with the Wii U, sadly.

Which is a pretty big boost, because (A) 32MB is overkill for a 720p system; the 360 got by with only 10MB, and even after triple buffering to eliminate all screen tearing there is still eDRAM to spare on the Wii U. Which brings us to (B): both the CPU and GPU have access to the eDRAM for operations that require fast memory access.


Remember that the 32MB also serves as the 1T-SRAM in Wii mode and must appear electronically identical to it, and that memory was more about low latency than bandwidth. The Wii U has to read and write constantly to 2GB of memory with a bandwidth of 12.8GB/s. You can put some speed-critical parts into the 32MB of eDRAM, especially the frame buffer, but ultimately all the Wii U can do is move data around its 2GB of main memory at 12.8GB/s; 32MB is only 1/64th of the total memory.

For comparison, the 360's main memory is 22.4GB/s and the bandwidth between its eDRAM and GPU is 32GB/s. The PS3 has 22.4GB/s to graphics memory and 25.6GB/s to main memory. So the PS3 can do roughly four operations across its main and graphics memory in the time the Wii U can do one to its DDR memory.

Shinshei said the 360 and Wii U eDRAM were comparable, but that the increase in size made the Wii U's far more useful. To meet the Wii's memory bandwidth requirements the eDRAM needs roughly 4GB/s while the Wii U GPU is running at 240MHz, so probably about 10GB/s absolute minimum when operating at full speed. How much bandwidth has it really got? No one seems to know, so it's an area of wild speculation, but it has to appear identical to 1T-SRAM when the GPU is slowed to 240MHz for Wii mode. If Shinshei is saying it's similar then perhaps 32GB/s, but I've seen 40GB/s and 60GB/s mentioned as well.

This is probably linked to the debate over whether the Wii U GPU is 176 GFLOPS or 352 GFLOPS. The Wii U isn't performing anywhere near the claimed 352 GFLOPS figure, isn't consuming the power for it either, and would be horrifically bottlenecked by 12.8GB/s memory access. I'm thinking the eDRAM speed is actually at the higher end of the speculation and the GPU is 176 GFLOPS; that makes sense given the performance of the console.

The Xbox One needs about 125GB/s of memory bandwidth to match the PS4's 176GB/s proportionally, considering it has a roughly 40% weaker GPU and most main-memory traffic is graphics data; only then would it not be bottlenecked. Xbox One memory is 68GB/s, and there is also the issue that many Xbox One games suffer more frame drops even when run at a lower resolution, despite a slightly faster CPU setup than the PS4. So clearly the 32MB of ESRAM is not enough to make up the shortfall, despite being far, far faster than eDRAM.

The 10MB of eDRAM on the 360 was enough to give it a slight frame rate advantage over many PS3 games, as long as the resolution fit within 10MB; the PS3 supports a much wider range of 1080p and 3D games that require larger frame buffers.

Also, 10MB of eDRAM for 512MB in the 360 is a higher ratio than 32MB for 2GB, though admittedly the operating system may have less call on time-critical memory access; I'm only guessing at that.

Let's face it, the consoles without small pockets of high-speed memory, but with reasonable bandwidth to main memory and/or graphics memory, achieve a lot more. The PS4 clearly does, as does the PS3 in properly optimised games despite having a much weaker GPU than the 360, as did the original Xbox the generation before. A small amount of high-bandwidth memory is really restricting for ambitious games and seems to have a common symptom: frame rate drops. Both the Xbox One and the Wii U suffer from it horribly. The 360 doesn't, but then its main memory wasn't slow; it had almost twice the bandwidth of the Wii U's memory, so for the 360 the eDRAM was a performance bonus that improved the console's games, not a workaround for cheaper, slower main memory.

Also, the Wii punched above its weight despite only having an 11 GFLOPS GPU (see Xenoblade), and that design has a dedicated 1MB texture cache, a 2MB frame buffer, 24MB of 1T-SRAM and 64MB of GDDR3 buffer memory: four pools of memory in addition to all the other smaller caches in its design. Obviously this was mainly inherited from the GameCube it was based on, but it still adds up to a lot of bandwidth, all things considered.

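To put the bandwidth figures from the post above side by side, and to show where the 176 vs 352 GFLOPS split comes from, here is a small sketch. The inputs are the numbers claimed in the post plus the commonly cited 550MHz Wii U GPU clock and the standard multiply-add FLOPS formula, which are my assumptions rather than anything confirmed in the thread:

```python
# Side-by-side of the main-memory bandwidth figures claimed above (GB/s),
# plus the shader-count arithmetic behind the 176 vs 352 GFLOPS debate.
# Inputs are the thread's own claims, not verified hardware specs.

main_memory_gbps = {
    "Wii U DDR3":         12.8,
    "Xbox 360 GDDR3":     22.4,
    "PS3 XDR (main)":     25.6,
    "PS3 GDDR3 (video)":  22.4,
    "Xbox One DDR3":      68.0,
    "PS4 GDDR5":         176.0,
}

for name, bw in main_memory_gbps.items():
    print(f"{name:18s} {bw:6.1f} GB/s  ({bw / 12.8:4.1f}x Wii U)")

# GFLOPS = shader ALUs * 2 ops per clock (multiply-add) * clock in GHz.
# The 176 vs 352 GFLOPS debate is simply 160 vs 320 ALUs at ~550MHz:
clock_ghz = 0.55  # commonly cited Wii U GPU clock (assumption, not from the post)
for alus in (160, 320):
    print(f"{alus} ALUs -> {alus * 2 * clock_ghz:.0f} GFLOPS")
```

The PS3's two pools together come to 48GB/s, which is where the "roughly four operations for every one" comparison comes from.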




Graphics aren't the reason I don't own the Wii U. That stupid tablet controller is what turned me off from wanting the system. At this point, I'm just waiting for the next Nintendo console.



Nope, as the PS4 itself is lacking. If it can't play 1080p at 60fps with high-end DX11 features, then it's poor.



bonzobanana said:
curl-6 said:

Which is a pretty big boost, because (A) 32MB is overkill for a 720p system; the 360 got by with only 10MB, and even after triple buffering to eliminate all screen tearing there is still eDRAM to spare on the Wii U. Which brings us to (B): both the CPU and GPU have access to the eDRAM for operations that require fast memory access.


Remember that the 32MB also serves as the 1T-SRAM in Wii mode and must appear electronically identical to it, and that memory was more about low latency than bandwidth. The Wii U has to read and write constantly to 2GB of memory with a bandwidth of 12.8GB/s. You can put some speed-critical parts into the 32MB of eDRAM, especially the frame buffer, but ultimately all the Wii U can do is move data around its 2GB of main memory at 12.8GB/s; 32MB is only 1/64th of the total memory.

For comparison, the 360's main memory is 22.4GB/s and the bandwidth between its eDRAM and GPU is 32GB/s. The PS3 has 22.4GB/s to graphics memory and 25.6GB/s to main memory. So the PS3 can do roughly four operations across its main and graphics memory in the time the Wii U can do one to its DDR memory.

Shinshei said the 360 and Wii U eDRAM were comparable, but that the increase in size made the Wii U's far more useful. To meet the Wii's memory bandwidth requirements the eDRAM needs roughly 4GB/s while the Wii U GPU is running at 240MHz, so probably about 10GB/s absolute minimum when operating at full speed. How much bandwidth has it really got? No one seems to know, so it's an area of wild speculation, but it has to appear identical to 1T-SRAM when the GPU is slowed to 240MHz for Wii mode. If Shinshei is saying it's similar then perhaps 32GB/s, but I've seen 40GB/s and 60GB/s mentioned as well.

This is probably linked to the debate over whether the Wii U GPU is 176 GFLOPS or 352 GFLOPS. The Wii U isn't performing anywhere near the claimed 352 GFLOPS figure, isn't consuming the power for it either, and would be horrifically bottlenecked by 12.8GB/s memory access. I'm thinking the eDRAM speed is actually at the higher end of the speculation and the GPU is 176 GFLOPS; that makes sense given the performance of the console.

The Xbox One needs about 125GB/s of memory bandwidth to match the PS4's 176GB/s proportionally, considering it has a roughly 40% weaker GPU and most main-memory traffic is graphics data; only then would it not be bottlenecked. Xbox One memory is 68GB/s, and there is also the issue that many Xbox One games suffer more frame drops even when run at a lower resolution, despite a slightly faster CPU setup than the PS4. So clearly the 32MB of ESRAM is not enough to make up the shortfall, despite being far, far faster than eDRAM.

The 10MB of eDRAM on the 360 was enough to give it a slight frame rate advantage over many PS3 games, as long as the resolution fit within 10MB; the PS3 supports a much wider range of 1080p and 3D games that require larger frame buffers.

Also, 10MB of eDRAM for 512MB in the 360 is a higher ratio than 32MB for 2GB, though admittedly the operating system may have less call on time-critical memory access; I'm only guessing at that.

Let's face it, the consoles without small pockets of high-speed memory, but with reasonable bandwidth to main memory and/or graphics memory, achieve a lot more. The PS4 clearly does, as does the PS3 in properly optimised games despite having a much weaker GPU than the 360, as did the original Xbox the generation before. A small amount of high-bandwidth memory is really restricting for ambitious games and seems to have a common symptom: frame rate drops. Both the Xbox One and the Wii U suffer from it horribly. The 360 doesn't, but then its main memory wasn't slow; it had almost twice the bandwidth of the Wii U's memory, so for the 360 the eDRAM was a performance bonus that improved the console's games, not a workaround for cheaper, slower main memory.

Also, the Wii punched above its weight despite only having an 11 GFLOPS GPU (see Xenoblade), and that design has a dedicated 1MB texture cache, a 2MB frame buffer, 24MB of 1T-SRAM and 64MB of GDDR3 buffer memory: four pools of memory in addition to all the other smaller caches in its design. Obviously this was mainly inherited from the GameCube it was based on, but it still adds up to a lot of bandwidth, all things considered.


Wii emulation doesn't mean at all that the memory bandwidth must be the same or even similar.

As to the eDRAM not being enough, at least one dev begs to differ:

http://www.vg247.com/2012/11/05/wii-u-avoids-ram-bottleneck-says-nano-assault-dev/

http://hdwarriors.com/why-the-wii-u-is-probably-more-capable-than-you-think-it-is/

http://thewiiu.com/topic/7747-interesting-article-regarding-cpugpu-in-wii-u/

"The performance problem of hardware nowadays (Interview circa 2012) is not clock speed but ram latency. Fortunately Nintendo took great efforts to ensure developers can really work around that typical bottleneck on Wii U. They put a lot of thought on how CPU, GPU, caches and memory controllers work together to amplify your code speed."

"Nintendo made very wise choices for cache layout, RAM latency and RAM size to work against these pitfalls"

"The Wii U eDRAM has a similar function as the eDRAM in the XBOX360. You put your GPU buffers there for fast access. On Wii U it is just much more available than on XBOX360, which means you can render faster because all of your buffers can reside in this very fast RAM. On Wii U the eDRAM is available to the GPU and CPU. So you can also use it very efficiently to speed up your application."

"Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency."

And again, the GDDR3 in the Wii was not just a buffer. You're thinking of the Gamecube, which used slower DRAM.
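As a rough sanity check on the "enough eDRAM" question, here is what plain render targets cost at each resolution, assuming simple 4-byte colour and depth buffers. Those formats are an assumption for illustration; real renderers use more targets and different formats, so this only shows the scale involved:

```python
# Rough render-target sizes vs. eDRAM capacity, assuming a plain 32-bit (4-byte)
# colour buffer plus a 4-byte depth buffer. Real games use more buffers and
# different formats, so this is only an illustration of scale.

MIB = 1024 * 1024

def buffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / MIB

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    colour = buffer_mib(w, h)
    depth = buffer_mib(w, h)
    print(f"{name}: colour {colour:.1f} MiB + depth {depth:.1f} MiB "
          f"= {colour + depth:.1f} MiB")

print("Xbox 360 eDRAM: 10 MiB   Wii U eDRAM: 32 MiB")
# 720p:  ~3.5 + ~3.5 = ~7.0 MiB  (fits in 10 MiB without MSAA)
# 1080p: ~7.9 + ~7.9 = ~15.8 MiB (needs tiling on 360; fits comfortably in 32 MiB)
```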



I'm already happy with the way Wii U games look.



    


JNK said:
DonFerrari said:

We are just discussing whether a game running at 1080p/60fps on PS4 and 720p/30fps on Wii U could use the same high-end assets, with both machines fully optimized and utilized... What we see is that it couldn't...

The whole premise of this thread is whether the Wii U could output 720p MP, and it can't at the moment (it would need better HW)... but if you don't want to accept the real limitations of the Wii U, no problem... I won't ever say a PC with 2 Titans, 8GB of DDR4, 8GB of GDDR5 and an i7 would be only marginally better than a PS4, merely making twice the fps and twice the pixels... it would be doing 4K/60fps on ULTRA with power to spare while the PS4 would struggle with 1080p/30fps on HIGH.

You or I may not care about the difference, but saying it doesn't exist won't help the Wii U in the slightest.

Watch the video I posted a few posts ago and check for yourself. In the comments some people even like the Wii U version more. There is almost no difference besides fps and resolution, and the game is only 900p/30fps on PS4. And the Wii U port was handled more poorly than the PS4 version.


So you basically want to go on make-believe, or say that devs just don't make good ports of something that would require only 1/4 of the PS4's capacity, just because?
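As a quick check on the ratios behind that exchange, using the same raw pixel-throughput shorthand as the earlier sketch (and with the same caveat that per-pixel workload is ignored):

```python
# Quick check of the capacity ratios mentioned above, using pixels per second
# as shorthand (same caveat as before: this ignores per-pixel workload).

def pixel_rate(width, height, fps):
    return width * height * fps

ratio_720p30_vs_1080p60 = pixel_rate(1280, 720, 30) / pixel_rate(1920, 1080, 60)
ratio_4k60_vs_1080p30   = pixel_rate(3840, 2160, 60) / pixel_rate(1920, 1080, 30)

print(f"720p/30fps is {ratio_720p30_vs_1080p60:.2%} of 1080p/60fps")  # ~22%, close to 1/4
print(f"4K/60fps is {ratio_4k60_vs_1080p30:.0f}x 1080p/30fps")        # 8x
```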



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."