
Forums - Nintendo Discussion - Would it be possible for the Wii U to run Zelda U at 60fps?

sc94597 said:
baloofarsan said:
This question also comes from a place of ignorance:
Are 60fps and 30fps the only options?
Is it possible to lock the framerate at 40fps or 50fps?

Consoles don't support v-sync, from what I recall. You'll get a lot of screen tearing if you run a game on a 30 Hz/60 Hz display at anything that doesn't divide evenly into the refresh rate. Basically, 60 Hz means the screen refreshes 60 times per second: with an fps lock at 30 you get a new frame every other refresh, or every refresh if your television supports 30 Hz. But at, say, 40 fps, every so often a frame finishes in between two refreshes rather than lining up with one, and the image tears: your eye sees part of the previous frame and part of the next. On PC there are techniques that accommodate this, since you can't realistically lock framerates for every PC's hardware; these are called v-sync, G-Sync, and so on. On consoles, however, a 40 fps game would have to handle that tearing some other way, and in many cases it can't without spending more resources, which would drag the framerate down to 30 fps anyway. So developers usually lock their game at either 30 fps or 60 fps, and if it's 30 fps they spend the headroom on more graphical features.

Also, in PAL countries the televisions refresh at 50 Hz and the games ran at 25 fps, if I recall correctly. Most displays also support 24 Hz, because that's the standard frame rate for film.

Consoles do support v-sync; where did you hear they don't? The reason games aren't locked to 45 fps is that on a 60 Hz monitor, 45 frames spread over 60 refreshes means some frames are held for two refreshes while others are held for only one. That uneven cadence causes judder, which looks and feels worse than a steady 30 fps.
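The refresh cadence being described is easy to see with a toy calculation (the function name and the assumption of perfectly even frame times are mine, just for illustration):

```python
# Sketch: how frames map onto the refreshes of a 60 Hz display, assuming
# perfectly even frame times (an idealized, hypothetical case).

REFRESH_HZ = 60

def refresh_cadence(fps, refreshes=12):
    """For a run of display refreshes, return how many consecutive refreshes
    each rendered frame stays on screen (2 = held twice, 1 = shown once)."""
    refreshes_per_frame = REFRESH_HZ / fps
    # newest finished frame at each refresh
    shown = [int(r / refreshes_per_frame) for r in range(refreshes)]
    # collapse into run lengths
    runs = []
    for frame in shown:
        if runs and runs[-1][0] == frame:
            runs[-1][1] += 1
        else:
            runs.append([frame, 1])
    return [count for _, count in runs]

print(refresh_cadence(30))  # [2, 2, 2, 2, 2, 2]          -> perfectly even
print(refresh_cadence(45))  # [2, 1, 1, 2, 1, 1, 2, 1, 1] -> uneven: judder
print(refresh_cadence(40))  # [2, 1, 2, 1, 2, 1, 2, 1]    -> uneven: judder
```

At 30 fps every frame is held for exactly two refreshes, so motion is even; at 45 or 40 fps the hold times alternate, which is the judder being described.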



BossPuma said:
[snip]

Consoles do support v-sync; where did you hear they don't? The reason games aren't locked to 45 fps is that on a 60 Hz monitor, 45 frames spread over 60 refreshes means some frames are held for two refreshes while others are held for only one. That uneven cadence causes judder, which looks and feels worse than a steady 30 fps.

I guess I was misinformed. Apparently v-sync just wasn't popular on the PS3/360 because of their RAM limitations, and that is why most games didn't support it. Do consoles support buffering techniques to deal with the duplicate-frame problem?



sc94597 said:

I guess I was misinformed. Apparently v-sync just wasn't popular on the PS3/360 because of their RAM limitations, and that is why most games didn't support it. Do consoles support buffering techniques to deal with the duplicate-frame problem?


Since DisplayPort and G-Sync monitors aren't supported by consoles, I would say not. Maybe a G-Sync-style variable-refresh technology will come along for the AMD-based consoles.



Excuse my ignorance, but since both sc94597 and BossPuma are well informed and willing to share their expertise, I have another question regarding frames per second:
In Smash Bros. the main objects run at 60 fps while some objects (certain Pokémon and assist trophies) run at 30 fps. Is this technique possible in a game like Zelda?



baloofarsan said:
Excuse my ignorance, but since both sc94597 and BossPuma are well informed and willing to share their expertise, I have another question regarding frames per second:
In Smash Bros. the main objects run at 60 fps while some objects (certain Pokémon and assist trophies) run at 30 fps. Is this technique possible in a game like Zelda?


It is possible to limit the animation rate of ambient objects, and it's not really noticeable at long distances, so yes. Dark Souls 2 did this for far-away enemies, whose animations ran at around 15 fps. I'd expect them to do this in Zelda U as part of their optimisation, because the world is so huge.
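The idea can be sketched in a few lines, in the spirit of what Dark Souls 2 reportedly does for distant enemies. Every name and distance threshold below is made up for illustration; this is not any engine's actual API:

```python
# Toy sketch of distance-based animation throttling: entities far from the
# camera step their animation less often than the game ticks.

ANIM_RATES = [(10.0, 60), (30.0, 30), (float("inf"), 15)]  # (max distance, anim Hz)

def anim_rate(distance):
    """Pick an animation update rate based on distance to the camera."""
    for max_dist, rate in ANIM_RATES:
        if distance <= max_dist:
            return rate

class Entity:
    def __init__(self, distance):
        self.distance = distance
        self.tick_count = 0     # game ticks seen
        self.anim_updates = 0   # times the skeleton was actually advanced

    def tick(self, game_hz=60):
        """Advance the animation only every Nth game tick, where N depends
        on how far away the entity is."""
        self.tick_count += 1
        step = game_hz // anim_rate(self.distance)   # ticks per anim update
        if self.tick_count % step == 0:
            self.anim_updates += 1

# One second of game time at 60 Hz:
near, far = Entity(distance=5.0), Entity(distance=100.0)
for _ in range(60):
    near.tick()
    far.tick()
print(near.anim_updates, far.anim_updates)  # 60 15
```

The game still ticks at 60 Hz, but the far-away entity only animates 15 times in that second, saving the skinning and animation-blending cost the other 45 ticks would have paid.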




I think 720p & 60fps is more likely than 1080p & 60fps (though I hope it ends up being the latter).

I honestly doubt it would NOT be 60fps.



Being open world isn't necessarily more demanding if you make the right adjustments to draw distance, LOD and so on. Even from a CPU standpoint, unless you're going for a high NPC count (AC: Unity), it won't give you more trouble. Remember, San Andreas had a massive world and ran on a PS2. You just scale for what you have.

It's always possible to run at the resolution and/or framerate you want; you just have to make sacrifices. Want to go to 1080p? You'll need 2.25x more performance than at 720p, or you'll have to take a hit to the framerate. Want 60 fps? Then you have two things to worry about:

- GPU: you will have half the time to render each frame. Decrease resolution, cut effects or remove AA until you reach it.

- CPU: you will have half the time to do your operations. Simplify AI or cut the number of NPCs, simplify physics, etc., until you reach it.

Remember: resolution is a GPU-only problem, while framerate demands that both the CPU and the GPU can do the job. It's not a question of "can it do it?"; it's more like "is what I have to cut to reach it worth it?". Framerate isn't king, as people always try to say. It's just one of the things that make up the experience.
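The arithmetic behind those figures, just written out (Python used as a calculator here):

```python
# Where the "2.25x" and the halved frame-time budget come from.

px_720  = 1280 * 720               # 921,600 pixels
px_1080 = 1920 * 1080              # 2,073,600 pixels
print(px_1080 / px_720)            # 2.25 -> 1080p shades 2.25x the pixels of 720p

budget_30 = 1000 / 30              # ~33.3 ms to produce each frame at 30 fps
budget_60 = 1000 / 60              # ~16.7 ms at 60 fps: half the time for BOTH
                                   # the CPU and the GPU to finish their work
print(budget_30 / budget_60)       # 2.0
```

Resolution only inflates the pixel count (a GPU-side cost), while the fps target shrinks the per-frame time budget for CPU and GPU alike, which is exactly why the two knobs trade off differently.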

Resolution is important, it's image clarity. Lowering it means that you will need a better (and more demanding) AA solution to make it look acceptable. Even the best art styles won't look good with low res + no AA (MK8). Number of NPCs is important too, a 30fps sandbox full of life is better than a 60fps desert. Going for more simple AIs isn't easy too: nobody likes dumb enemies.

captain carot said:
About that 1080p30=720p60:
That is only correct at the GPU level, mainly because most games today are limited by shader power on the graphics side.

At the CPU level that is wrong, and CPUs still do a lot of work. I don't know why people forget that nowadays.


That's not correct even at the GPU level. Modern GPUs are complex and a game can be bottlenecked by more than one thing, so decreasing resolution won't always give you a proportional performance increase.

About the CPU, I agree. If it's CPU-limited, resolution won't matter.



sc94597 said:

As has been said in this thread, every single Zelda game except ALBW has run at 30fps. Furthermore, Wind Waker HD ran at 30 fps. Even if it were possible for Nintendo to run the game at 60 fps at its current graphical fidelity, they'd likely spend the headroom elsewhere (such as AA) for a locked 30 fps.


That's not true. FSA on the GCN ran at 60fps, OoT ran at 20fps, and MM ran at 20fps. WW was the first 3D Zelda to run at 30fps, and that was locked, which is why WWHD couldn't run higher or lower.



sc94597 said:

Consoles don't support v-sync, from what I recall. [...]

Just a correction: they do. Killzone Shadow Fall and InFamous: SS are just two examples of games using adaptive V-sync because of their fluctuating framerates (40-50 fps). Syncing frames is just software; you can do it even on Android if you want.

About fps locks: a lot of PC games do just that when you turn V-sync on, they lock at 60 or 30 and call it a day. It's a sensible solution, since a fluctuating framerate brings not just tearing but stuttering too, and that's way more annoying.
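Adaptive v-sync boils down to one decision per frame, which a toy model makes concrete. This is only a sketch of the idea; console presentation APIs aren't public and certainly differ in the details:

```python
# Toy model of adaptive v-sync on a 60 Hz display. A frame starts rendering
# at a vblank (t = 0 ms) and takes render_ms to finish.

REFRESH_MS = 1000 / 60   # ~16.7 ms between vblanks

def present(render_ms, adaptive=True):
    """Return (time in ms when the frame hits the screen, whether it tore)."""
    if render_ms <= REFRESH_MS:
        return REFRESH_MS, False        # on time: wait for the next vblank
    if not adaptive:
        return 2 * REFRESH_MS, False    # plain v-sync: stall a whole extra
                                        # refresh (drops to a 30 fps cadence)
    return render_ms, True              # late: show it now, accept a tear

print(present(10))          # synced to the vblank, no tear
print(present(20))          # ~50 fps frame shown immediately: tears, less stutter
print(present(20, False))   # held back to the vblank after next: 30 fps pacing
```

So a game fluctuating around 40-50 fps trades a small tear for much smoother pacing, whereas strict v-sync would snap every late frame down to the 30 fps cadence.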



BossPuma said:
It is possible to limit the animation rate of ambient objects, and it's not really noticeable at long distances, so yes. Dark Souls 2 did this for far-away enemies, whose animations ran at around 15 fps. I'd expect them to do this in Zelda U as part of their optimisation, because the world is so huge.


Thank you! This has been a glorious evening as my understanding of the 60/30 fps debate has widely improved.