Kyros said:
I got the impression you thought 30 fps was the be-all-end-all framerate


No. I don't want to wade into the human physiology war. All I'm saying is that when I optimize PC games for my somewhat older computer, I generally use 1024*768 and push every setting up as long as I keep >20 fps in heavy scenes and more during normal gameplay. That's the best use of limited resources for me. And since a console is by its nature also a fixed resource, I imagine developers make similar adjustments.
But I agree that this differs from game to game: RPGs like Oblivion can stomach a lower framerate, while hectic games like space sims need more.

But to say developers are lazy because they don't hit 60 fps is a bit simplistic. It's simply a question of priorities. And I still think the interesting question isn't 30 fps vs. 60 fps but what the game does in extreme situations (like whole-city views in Assassin's, lots of enemies on screen in an FPS, ...).

Yes, as with everything, it's a matter of making the appropriate trade-offs.

Regarding your remark about what games do in extreme situations, that's actually linked to the 30/60 fps thing. As you said, PC games often run at full speed, with the framerate varying over time. Console games are often locked to a maximum of 30 or 60 fps (depending on the game), and developers then make the remaining decisions so that the game holds that framerate at all times without slowdown. I suspect one of the biggest reasons for this is that consoles have fixed, known specs, unlike PCs. That means developers can plan in advance and say "in this game, we're shooting for a stable X frames per second", something they can't easily do for PC games. Another reason is that unlike PC monitors, most TVs don't refresh faster than 60 Hz, so there's no point enabling higher framerates.
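
To make the "fixed frame budget" idea concrete, here's a toy sketch in C++ (purely illustrative, not how any particular engine does it; real console games usually sync to the display's refresh instead of sleeping). Every frame gets a 33.3 ms budget, and the loop sleeps off whatever the update and rendering didn't use. The update() and render() stubs are just placeholders standing in for the real work.

```cpp
#include <chrono>
#include <thread>
#include <cstdio>

// Placeholder stubs -- in a real game these would be the actual
// simulation and drawing code.
static void update() { /* game logic would go here */ }
static void render() { /* drawing would go here */ }

int main() {
    using Clock = std::chrono::steady_clock;

    // Target 30 fps: a budget of 1/30 s (about 33.3 ms) per frame.
    const auto frame_budget = std::chrono::duration_cast<Clock::duration>(
        std::chrono::duration<double>(1.0 / 30.0));

    auto next_frame = Clock::now() + frame_budget;

    for (int frame = 0; frame < 300; ++frame) {  // run ~10 seconds for the demo
        update();
        render();

        // Sleep away whatever is left of the budget. If update()+render()
        // took longer than 33.3 ms, sleep_until returns immediately and the
        // game visibly slows down -- exactly the case console developers
        // design their content to avoid.
        std::this_thread::sleep_until(next_frame);
        next_frame += frame_budget;
    }

    std::puts("done");
    return 0;
}
```

The same sketch also shows why 60 fps costs so much: halving the frame time means everything (AI, physics, rendering) has to fit into roughly 16.7 ms instead of 33.3 ms.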

My Mario Kart Wii friend code: 2707-1866-0957