| spemanig said:
With WWHD, I'm pretty sure that its running at 30fps had more to do with that game being locked to 30fps. Like, the framerate won't even unlock. It had nothing to do with how Nintendo "felt."
How would it be no problem, though? Why does 1080p 30fps automatically translate to a relatively simple transfer to 720p 60fps? How do you know that's the case? What is the precedent?
|
When i say "no problem" i mean in regards to it being a possible transition. There are various factors to consider, but for a console like the WiiU (that's mostly GPU reliant), 720p/60fps would in many respects be easier to pull off than 1080p/30fps (which has a pixel count of about 12.5% more). It wouldn't be as simple as the press of a button, various parts of the game itself would need to be reworked slightly (such as animations), but if you have the resources for one you should be able to do the other.
Answer to your second reply:
I'm not an expert, but the topic interests me.
Anyway, something being "bigger" doesn't mean it's more difficult to run. All that really matters (most of the time) is the LOD of the assets, and how far out a game maintains that quality. You'd be amazed how awful some of the assets open world games use for things far away from the player :p The PS2 would look away in disgust at some of them.
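Roughly what I mean by LOD, as a toy Python sketch (the asset names and distance cutoffs are made up purely for illustration):

    # Hypothetical distance-based LOD selection: the renderer swaps in
    # progressively cheaper versions of an asset the farther it sits
    # from the camera.
    LOD_TIERS = [
        (50.0, "tree_high.mesh"),              # full detail up close
        (150.0, "tree_med.mesh"),
        (400.0, "tree_low.mesh"),
        (float("inf"), "tree_billboard.png"),  # flat sprite in the far distance
    ]

    def pick_lod(distance_to_camera):
        for max_distance, asset in LOD_TIERS:
            if distance_to_camera <= max_distance:
                return asset

    print(pick_lod(300.0))  # tree_low.mesh: PS2-grade, but you'd rarely notice at that range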
Cel-shading isn't necessarily easier to run, but it does often let you get away with having worse overall assets, which is likely why Nintendo chose it. The lower the quality of the local assets they can get away with, the larger they can make the LOD distances, which in turn makes the quality drop-off less noticeable (something most open world games do their best to accomplish).
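As for what cel-shading actually does: the core trick is quantizing the lighting into a few flat bands instead of a smooth gradient. A toy version (not Nintendo's actual shader) looks something like:

    import math

    # Toy cel-shading: snap a standard Lambert diffuse term (n dot l,
    # clamped to 0..1) to a handful of flat bands.
    def cel_diffuse(n_dot_l, bands=3):
        n_dot_l = max(0.0, min(1.0, n_dot_l))
        return math.ceil(n_dot_l * bands) / bands

    for x in (0.1, 0.4, 0.9):
        print(x, "->", cel_diffuse(x))  # 0.333..., 0.666..., 1.0

Those flat bands are exactly what hides low-poly silhouettes and low-res textures, which is the "getting away with worse assets" part.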