
Why Final Fantasy 15 can easily run on Wii U

Zero999 said:

I agree on the A.I. that many use as an excuse for hardware demands, but it's been at the same level for a while.

But a Wii U version could easily run it at 720p, and maybe 60fps too, without many downgrades.

All of this because you say so, of course.




This just in: GTAV for Wii U confirmed before FFXV and KH3. Congratulations. While I was on vacation, I traveled to Japan and went to SE's HQ in Shinjuku. Then I printed out this thread along with an analysis of the estimated profits SE would lose if they didn't port it to the Wii U, and they not only apologized to me but promised that FFXV has as good a chance of coming to the Wii U as FF Versus XIII had of coming to the PS3.
I'm posting here to apologize to Zero, who was actually right!
Now that you've succeeded, are you going to try to get EA to support Nintendo as well?



In this day and age, with the Internet, ignorance is a choice! And they're still choosing Ignorance! - Dr. Filthy Frank



Zero999 said:
SubiyaCryolite said:
Zero999 said:

NO. The Wii was 20x+ less powerful than the PS3/360. Wii U vs Xbox One vs PS4 is more like 1 vs 2 vs 3.

And it's not only raw power (although I'd guess 600 GFLOPS on Wii U); the Wii U has all the modern effects and tools that the PS3/360 lack, which makes the difference.

Furthermore, as someone who actually owned a 620 GFLOPS card (Radeon HD 5670 GDDR5) and who currently owns a Wii U, I can completely rule that out. Virtually impossible. If it were true, every 7th-gen port would be 900p30 out of the box with no "optimisation" required, and 900p60 would be the norm for Nintendo's first-party games. Stop making things up, Zero.

A DX10-based HD 4850 is still more powerful than a DX11-based HD 5670, so stop acting like that's magical fairy dust. DX11 features mean nothing if the card can't use them in real time at acceptable performance levels.

Bold: because a non-customized graphics card on a PC performs in exactly the same way as on a console, right?

"A direct comparison with the PC version set to the ultimate quality preset reveals some large discrepancies between the Definitive Editions compared with the full-fat PC experience. Aspects such as tessellation are missing on the characters and environments, with some of these elements appearing more blocky on the PS4 and Xbox One as a result. Meanwhile, motion blur is used much more sparingly, while texture resolution is noticeably lower in some cases. On the flip side, all the Definitive Edition graphical extras - such as the dynamic foliage and the impressive sub-surface scattering - are absent from the PC, which represents another (albeit smaller) compromise."

Tomb Raider Definitive Edition average framerate ON PS4 (1920 X 1080)? 53fps

http://www.eurogamer.net/articles/digitalfoundry-2014-tomb-raider-definitive-performance-analysis

Tomb Raider DX11 Ultra on PC, average framerate on an HD 7870 (1920 x 1200)? 52fps

OMFG! Logic! It can't be! It's impossible. With console optimisation it should be 90fps, OMG! *facepalm*

And lastly, an X1950 Pro (the 360's GPU equivalent) running PC games at 360-level performance (30-45fps, medium settings, 1280 x 1024) seven years later. Common sense, use it.

You think console customisation is magic fairy pixie dust? Get a clue.

This post was written specifically for people like you.

====================================================================

 

A big problem with all the talk of console optimization is that it usually lacks a proper understanding of what, exactly, "optimization" is. Consoles do tend to have thinner API layers than PCs, but that doesn't amount to much at all. The absurd claims people put forward of 2x performance improvements are deeply insulting to the software engineers who develop DirectX, OpenGL, and GPU drivers. There isn't a chance in hell that these professional, mature APIs waste anything close to half of a GPU's cycles. Seriously consider how ridiculous that would be: such a solution would be wholly inept and quickly replaced by something superior, given that there are companies, and divisions of companies, whose revenue and livelihood largely depend on processing performance. The fact of the matter is that the PC platform is single-handedly driving the advancement of high-performance graphics processors these days; it's rather important that APIs exist which do not cripple performance to such a vast degree.
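The one place a thinner API layer does measurably help is CPU-side submission cost per draw call, and engines on every platform already attack that the same way: by sorting and batching work so fewer calls cross the API boundary at all. A minimal sketch of that idea, with the struct, object counts, and material counts invented purely for illustration rather than taken from any real engine:

```cpp
// Illustrative sketch (not from any real engine): the measurable cost of a
// "thick" API layer is CPU time per draw call, not lost GPU cycles. Engines
// on every platform shrink that cost the same way: sort by state and batch,
// so fewer calls cross the API boundary in the first place.
#include <algorithm>
#include <cstdio>
#include <vector>

struct DrawItem {
    int materialId;  // shader + textures; switching it forces a new submission
    int meshId;
};

// Count the API submissions needed once items are sorted by material.
int submissionsAfterBatching(std::vector<DrawItem> items) {
    std::sort(items.begin(), items.end(),
              [](const DrawItem& a, const DrawItem& b) {
                  return a.materialId < b.materialId;
              });
    int batches = 0;
    int lastMaterial = -1;
    for (const DrawItem& item : items) {
        if (item.materialId != lastMaterial) {
            ++batches;  // one state change + instanced draw per material group
            lastMaterial = item.materialId;
        }
    }
    return batches;
}

int main() {
    // 3000 objects spread over 40 materials -- arbitrary example numbers.
    std::vector<DrawItem> scene;
    for (int i = 0; i < 3000; ++i) scene.push_back({i % 40, i});

    std::printf("naive draw calls:   %zu\n", scene.size());
    std::printf("batched draw calls: %d\n", submissionsAfterBatching(scene));
    // Whatever the per-call driver cost is on a given platform, cutting the
    // call count from 3000 to 40 dwarfs any "thin API" saving per call.
    return 0;
}
```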

The issue of optimization is not an underlying software issue (except perhaps for draw calls). However, if developers know a given platform has fixed and well-understood specifications, they can very much benefit from tweaking rendering options to suit the strengths of that hardware: tone down settings that kill performance, design clever streaming and occlusion-culling solutions, and so on. None of this is impossible on PCs either. Optimization via streaming and occlusion culling is design-level work, and it benefits PCs as well by not wasting precious resources on unseen or unnecessary assets.

The problem arises with rendering settings. These, too, are tweakable on PCs. The difference is that it's on the end user to decide what balance they want, and, as sometimes happens with late-generation multiplatform games, the baseline of PC hardware is so far ahead in features and power that PC versions ship with more advanced settings enabled by default. People then confuse the pared-back, optimized console settings balance with the hardware somehow gaining performance over time. It is not that; that is impossible. The pure computational capacity of a processor is fixed, period.

Optimization is simply a matter of making the most of perceptible graphical differences, because some effects and rendering methods produce an arguably small visual difference to a lot of people yet incur a huge performance penalty (see: SSAA/FSAA, ambient occlusion, soft shadows, fully dynamic lighting, tessellation, etc.). And since PC games never have an explicit "console-level settings" option, people tend to assume false equivalencies. If a GPU stronger than a console's is struggling with a game the console handles fine, it typically means one of two things: the PC GPU is rendering at higher settings, or the game was poorly ported to PC.
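To put that settings-balance argument in concrete terms, here is a rough sketch of what a fixed console-style preset versus a PC "Ultra" default looks like. Every setting name and cost multiplier below is made up for illustration; the structure of the trade-off, not the numbers, is the point:

```cpp
// Illustrative only: made-up settings and cost numbers, not any real game's
// configuration. A console build ships one fixed, hand-tuned balance, while
// a PC build exposes the same knobs to the user (and often defaults higher).
#include <cstdio>

struct RenderSettings {
    int  shadowResolution;   // e.g. 1024 vs 2048
    bool screenSpaceAO;
    bool tessellation;
    int  msaaSamples;        // 0 = off
    bool softShadows;
};

// A fixed "console-style" balance: expensive, low-visibility effects dialed
// down or off so the frame consistently fits the budget.
constexpr RenderSettings kConsolePreset{1024, true, false, 0, false};

// A typical PC "Ultra" default: everything on, whatever the cost.
constexpr RenderSettings kUltraPreset{2048, true, true, 4, true};

// Very rough relative frame cost, just to show how the trade-off is reasoned
// about; the multipliers here are invented for the example.
double relativeFrameCost(const RenderSettings& s) {
    double cost = 1.0;
    cost *= (s.shadowResolution >= 2048) ? 1.15 : 1.0;
    cost *= s.screenSpaceAO ? 1.10 : 1.0;
    cost *= s.tessellation  ? 1.20 : 1.0;
    cost *= 1.0 + 0.08 * s.msaaSamples;
    cost *= s.softShadows   ? 1.15 : 1.0;
    return cost;
}

int main() {
    std::printf("console preset relative cost: %.2f\n",
                relativeFrameCost(kConsolePreset));
    std::printf("ultra preset relative cost:   %.2f\n",
                relativeFrameCost(kUltraPreset));
    // The gap between the two is what gets mistaken for "console magic":
    // the hardware isn't faster, it is simply doing visibly less work.
    return 0;
}
```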



I predict that the Wii U will sell a total of 18 million units in its lifetime. 

The NX will be a 900p machine

Zero999 said:

You are free to think the many games from early in the Wii U's life that far surpass a 240 GFLOPS machine are no basis, but reality says otherwise.

You must be in an alternate reality, then. Your "reality" says 600 GFLOPS; good for you.
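Since the whole dispute keeps coming back to GFLOPS figures, it's worth spelling out how theoretical peak numbers are usually derived: shader ALUs x clock x 2 FLOPs per cycle (one fused multiply-add per ALU). The figures below are the commonly cited community estimates, not official specs; Nintendo never published the Wii U GPU's ALU count or clock, so that entry in particular is only an estimate from die-shot analyses:

```cpp
// Theoretical peak single-precision GFLOPS is usually quoted as:
//   shader ALUs x clock (GHz) x 2 (one fused multiply-add per ALU per cycle).
// The figures below are commonly cited community estimates, not official
// spec-sheet numbers -- the Wii U row in particular comes from die-shot
// analyses, since its GPU's shader count and clock were never published.
#include <cstdio>

struct Gpu {
    const char* name;
    int shaderAlus;
    double clockGHz;
};

double peakGflops(const Gpu& g) {
    return g.shaderAlus * g.clockGHz * 2.0;  // 2 FLOPs per ALU per cycle (FMA)
}

int main() {
    const Gpu gpus[] = {
        {"Wii U (estimated)", 160, 0.550},   // ~176 GFLOPS
        {"Xbox One",          768, 0.853},   // ~1310 GFLOPS
        {"PS4",              1152, 0.800},   // ~1843 GFLOPS
    };
    for (const Gpu& g : gpus)
        std::printf("%-18s ~%.0f GFLOPS\n", g.name, peakGflops(g));
    return 0;
}
```

If those estimates are in the right ballpark, the ratio is closer to 1 : 7 : 10 than the 1 : 2 : 3 claimed earlier in the thread, and nowhere near 600 GFLOPS for the Wii U.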



I predict that the Wii U will sell a total of 18 million units in its lifetime. 

The NX will be a 900p machine


Okay, I think that's enough of this.



Monster Hunter: pissing me off since 2010.