| twesterm said: Going above 30 on TV there's a slight difference, but there's no point in it, being that the trade-off totally isn't worth it. Most gamers aren't going to notice the difference between a game that runs at 30fps on a TV and another that runs at 60fps on a TV.
This isn't really true, though it depends on the TV.
Back in the early '90s, Sega and other big Japanese developers did a lot of beta testing in arcades, and their finding was that 60fps consistently made more money than 30fps despite the inferior graphics. Since then Sega has never made an arcade racer that doesn't run at a constant 60fps, and likely never will. It's interesting to note that most of their console racers only run at 30fps, though...
It seems that whenever a developer just wants to *sell* you a complete game, they increase graphics quality at the expense of the framerate. Whenever their income depends on you actually *playing* the game (arcades), they drop graphics quality and increase the framerate. Nintendo tends to be an exception. Much as I dislike the Wii, I have to give the company some credit for the fact that most of their first-party games run at 60fps.
You can do a test yourself. Try playing a 60fps racer such as Forza 2 for a long time on a high-quality CRT TV, then switch to a 30fps racer such as Dirt. Or play Devil May Cry 4, then switch to Viking. I think you'll be shocked at the difference.
The TV does make a difference, though. A low framerate is more noticeable on CRTs than on plasma or LCD sets, as CRT is the only screen technology that completely redraws each frame. On flat screens each frame just morphs into the next, creating a slight motion-blur effect that helps cover up low framerates. And most LCDs don't have response times fast enough to correctly display 60fps; in fact, some LCDs can't even display 30fps correctly.
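To put rough numbers on that last point: at 60fps each frame is only on screen for about 16.7ms, and at 30fps about 33.3ms, so a panel whose pixel response time is longer than that is still mid-transition when the next frame arrives. Here's a quick sketch of that arithmetic (the response-time figures are made-up examples for illustration, not measurements of any real set):

```python
# Quick frame-budget arithmetic for the response-time point above.
# The panel response times below are illustrative assumptions,
# not measurements of any particular TV.

def frame_budget_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

panels = {
    "fast LCD (~8 ms response)": 8.0,
    "slow LCD (~25 ms response)": 25.0,
    "very slow LCD (~40 ms response)": 40.0,
}

for fps in (30, 60):
    budget = frame_budget_ms(fps)
    print(f"{fps} fps -> {budget:.1f} ms per frame")
    for name, response in panels.items():
        verdict = ("keeps up" if response <= budget
                   else "still transitioning when the next frame arrives")
        print(f"  {name}: {verdict}")
```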