Michael-5 said:
That's a crude way to explain it, but it's kind of true, and this is why people think they can see 60FPS. At 30FPS, for a digital signal (say progressive scan - p), humans can still detect the change in screens. At 60FPS humans can't see that transition. However, at 60FPS humans will only see half the frames in a given second. People should look into my example of a car rim on the highway. When a car is accelerating, you first see the rim move clockwise (if you are to the right of the car), then it starts to skip, then it starts to go backward slowly, then it skips again, then it goes backward really fast. Where it goes backward slowly is about 30FPS, because every 30th of a second the next spoke is just behind where the prior one was a 30th of a second earlier. There is a lot more to this, but I want people to understand that basic point. 60FPS is smoother than 30FPS, not because we can see at 30FPS, but because we see the screen transitions (when the rate is not a factor of 30 and is under 60FPS), the drops in FPS, and screen tearing.
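(Quick aside before I reply: the car-rim illusion described above is ordinary temporal aliasing, and it's easy to simulate. Below is a minimal sketch, assuming a hypothetical 5-spoke rim and made-up rotation speeds, showing how the apparent per-frame motion flips sign when the wheel speed beats against a 30 fps sample rate.)

```python
# Sketch of the "car rim" (wagon-wheel) effect: a 5-spoke wheel sampled at a
# fixed frame rate. All numbers are illustrative, not measured values.
import math

SPOKES = 5                    # hypothetical 5-spoke rim
SPOKE_ANGLE = 360.0 / SPOKES  # 72 degrees between visually identical positions

def apparent_step(wheel_hz, fps):
    """Apparent rotation per frame, in degrees, after temporal aliasing."""
    true_step = 360.0 * wheel_hz / fps        # real rotation between consecutive frames
    step = math.fmod(true_step, SPOKE_ANGLE)  # spokes look identical, so only this much is visible
    if step > SPOKE_ANGLE / 2:
        step -= SPOKE_ANGLE                   # closer to the previous spoke: reads as backward motion
    return step

# Spin the wheel up and watch the perceived motion at 30 fps:
for wheel_hz in (2, 5, 5.8, 6, 6.2, 12):
    print(f"{wheel_hz:4.1f} rev/s sampled at 30 fps -> {apparent_step(wheel_hz, 30):+6.1f} deg/frame")
```

Run it and you see exactly the progression described: forward, then slow backward (5.8 rev/s gives about -2.4 deg/frame), then apparently stationary when the rotation lines up with the sample rate.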
Interesting, you say that my point is "kind of true", and then spend the rest of your post claiming the exact opposite of what I said. OK, let's start putting in some references.
Humans can see flicker at anything less than 70-100 fps [1]. This can be counteracted by deliberately adding motion blur, but that has its own disadvantages.
Humans can see bright flashes of light lasting as little as 1/220th of a second in the centre of their vision, and even shorter pulses in their peripheral vision [1]. That requires 220 fps to display properly. You might argue that this probably isn't relevant for video games; I'm just trying to teach you how wrong the 'human vision is 30 fps' thing is. In all honesty though, this site does a better job of explaining it than I do: http://amo.net/NT/02-21-01FPS.html. Go, read, learn!
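(The 220 fps figure is just frame-time arithmetic, as I read it: for an event of duration d seconds to persist for at least one whole frame, the frame time 1/fps must be no longer than d, i.e. fps >= 1/d. A throwaway sketch, with purely illustrative durations:)

```python
# Minimum frame rate needed so a flash of a given duration fills at least one frame.
import math

def min_fps_for_flash(duration_s):
    """Smallest whole frame rate whose frame time (1/fps) is no longer than the flash."""
    return math.ceil(1.0 / duration_s - 1e-9)  # small tolerance guards against float round-off

for d in (1 / 30, 1 / 60, 1 / 220):
    print(f"a flash lasting {d * 1000:6.2f} ms needs at least {min_fps_for_flash(d)} fps")
```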
[1] http://www.100fps.com/how_many_frames_can_humans_see.htm
