Locked: Wii U's eDRAM stronger than given credit?


supernihilist said:
you can't compare a single unit's performance to a closed environment like a console

a console will always perform better than a standard benchmark because of the optimizations

Not true. Tomb Raider on PS4 runs at ~50 fps; Tomb Raider on a 7870 at 1920x1200, Ultra settings, runs at ~52 fps.

The Xbox 360's GPU is roughly equivalent to a Radeon X1950, and the PS3's to an Nvidia 7800 GTX. Those cards handle console games at the same resolution and frame rate with only minor differences.

Case in point: BioShock on an X1950 Pro at 1024x768, High settings, no AA runs at ~40 fps. Reasonable to expect 30 fps at 720p, no? Probably with muddier textures because of limited RAM.
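A quick sanity check on that expectation: if a game is mostly fill-rate bound, frame rate scales roughly inversely with pixel count. This is a naive back-of-the-envelope model, not a benchmark, and the 40 fps input is just the figure quoted above.

```python
# Naive resolution-scaling estimate: assumes the game is GPU fill-rate
# bound, so fps scales inversely with the number of pixels rendered.
# Purely illustrative; real games are rarely this simple.

def scaled_fps(base_fps: float, base_res: tuple, target_res: tuple) -> float:
    """Estimate fps at a new resolution under naive pixel-count scaling."""
    base_pixels = base_res[0] * base_res[1]
    target_pixels = target_res[0] * target_res[1]
    return base_fps * base_pixels / target_pixels

# X1950 Pro runs BioShock at 1024x768 around 40 fps (figure from the post).
estimate = scaled_fps(40.0, (1024, 768), (1280, 720))
print(round(estimate, 1))  # roughly 34 fps at 720p
```

So even under this crude model, ~30 fps at 720p with some settings trimmed is a plausible outcome, which is the point being made.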

That "myth" isn't true; see below.


And a post from someone much smarter than me on GAF:

======================================================

A big problem with all the talk of console optimization is that it usually lacks a proper understanding of what, exactly, "optimization" is. Consoles do tend to have thinner API layers than PCs, but that doesn't amount to much at all. The absurd claims people put forward of 2x performance improvements are deeply insulting to the software engineers who develop DirectX, OpenGL, and GPU drivers. There isn't a chance in hell that these professional, mature APIs waste anything close to half of a GPU's cycles. Seriously consider how ridiculous it is to think so: such a solution would be wholly inept and quickly replaced by something superior, given that there are companies, and divisions of companies, whose revenue and livelihood largely depend on processing performance. The fact of the matter is that the PC platform is single-handedly driving the advancement of high-performance graphics processors these days, so it's rather important that APIs exist which do not cripple performance to such a vast degree.

 

The issue of optimization is not an underlying software issue (except perhaps for draw calls). However, when developers know a given platform's fixed, well-understood specifications, they can benefit greatly from tweaking rendering options to suit that hardware's strengths (i.e. tone down settings that kill performance, design clever streaming and occlusion-culling solutions, etc.). This is not impossible on PCs either: streaming and occlusion-culling optimizations are design-level and benefit PCs as well, by not wasting precious resources on unseen or unnecessary assets.

The problem arises from rendering settings. These, too, are tweakable on PCs; the difference is that it's on the end-user to determine what balance they want, and, as with late-generation multiplatform games, the baseline of PC hardware is sometimes so far ahead in features and power that PC versions ship with more advanced settings enabled by default. People then confuse the pared-back, optimized console settings balance with performance that has increased over time. It is not that; that is impossible. The pure computational capacity of a processor is fixed, period.

Optimization is simply a matter of making the most of perceptible graphical differences, because some effects and rendering methods produce an arguably small visual difference to many people yet incur a huge performance penalty (see: SSAA/FSAA, ambient occlusion, soft shadows, fully dynamic lighting, tessellation, etc.). Since PC games never have an explicit "console-level settings" option, people tend to assume false equivalencies. If a GPU stronger than a console's struggles with a game the console runs fine, it typically means one of two things: the PC GPU is rendering at higher settings, or the game was poorly ported to PC.
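The design-level optimization described above, not spending GPU time on unseen assets, can be sketched in miniature. This is a toy model with hypothetical scene data: real engines use frustum planes and hardware occlusion queries, while this version culls by distance and a simple view cone.

```python
# Minimal visibility-culling sketch: skip draw calls for objects the camera
# cannot see, so no GPU work is wasted on unseen assets. Toy example only;
# the scene, names, and thresholds here are invented for illustration.

from dataclasses import dataclass
import math

@dataclass
class SceneObject:
    name: str
    position: tuple  # (x, y, z) world coordinates
    radius: float    # bounding-sphere radius

def visible(obj, cam_pos, cam_dir, max_dist=100.0, fov_deg=90.0):
    """Return True if the object's bounding sphere may be visible."""
    dx = tuple(o - c for o, c in zip(obj.position, cam_pos))
    dist = math.sqrt(sum(d * d for d in dx))
    if dist - obj.radius > max_dist:   # too far away: cull (or stream out)
        return False
    if dist <= obj.radius:             # camera is inside the sphere
        return True
    # Angle between the camera's forward vector and the direction to the object.
    dot = sum(d * f for d, f in zip(dx, cam_dir)) / dist
    half_fov = math.radians(fov_deg) / 2
    # Widen the cone by the sphere's angular size so objects clipping the
    # edge of the view are never wrongly culled.
    margin = math.asin(min(1.0, obj.radius / dist))
    return math.acos(max(-1.0, min(1.0, dot))) <= half_fov + margin

scene = [
    SceneObject("tree", (0, 0, 10), 1.0),    # in front of camera: drawn
    SceneObject("rock", (0, 0, -10), 1.0),   # behind camera: culled
    SceneObject("hill", (0, 0, 500), 20.0),  # beyond draw distance: culled
]
draw_list = [o.name for o in scene if visible(o, (0, 0, 0), (0, 0, 1))]
print(draw_list)  # ['tree']
```

The point of the sketch: two of the three objects never generate a draw call at all, which is exactly the kind of saving that benefits a fixed-spec console and a PC alike.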

I predict that the Wii U will sell a total of 18 million units in its lifetime. 

The NX will be a 900p machine


Time to put a stop to this endless circle-jerk, I'd say (might help keep Ninjablade away).

Locking.



Monster Hunter: pissing me off since 2010.