Michael-5 said:
Norris2k said:
Michael-5 said:
2. Above 24 FPS, a human eye can't detect any changes to frame rate (first year chem/bio). You will see a smoother image, but never any screen tear or lag. So above 24 FPS the difference is marginal. PS3 games lock in at either 30 or 60 FPS, XB1 does the same, and so does WiiU. Neither is performing better here; all are doing well.

There is nothing like that in the human eye or brain. The light signal is continuous, and there is no notion of FPS in the eye or brain. 24 FPS is just a convenient and very old standard that, with tricks and limitations, produces a non-interactive, good-enough-looking movie:

1 - Motion blur in the picture tricks the eye into seeing the animation as smoother than it is. Lag would be painfully noticeable if you were to watch 24 FPS without blur. Don't expect a 3D game to match the real motion blur that comes from light exposure over time on a real object.

2 - In an interactive game, if you click a button and the action displays 1/30th or 1/60th of a second later, you will notice. In a movie you have no reference to tell whether a picture is late or not. In a game you compare against your own input.
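To put rough numbers on that point (my own arithmetic, not from the post above): one frame at a given frame rate is the minimum delay between your input and the earliest frame that can show its result.

```python
# Minimum input-to-display delay: the duration of a single frame
# at common frame rates. Numbers are simple arithmetic, not measurements.
for fps in (24, 30, 60):
    frame_time_ms = 1000 / fps
    print(f"{fps} FPS -> one frame lasts {frame_time_ms:.1f} ms")
# 24 FPS -> one frame lasts 41.7 ms
# 30 FPS -> one frame lasts 33.3 ms
# 60 FPS -> one frame lasts 16.7 ms
```

So going from 30 to 60 FPS halves the best-case response time, which is exactly the kind of difference you can feel against your own input even if the picture alone looks similar.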

3 - You only need 1 FPS to display an unmoving wall, but the faster the movement, the more FPS you need to make it smooth. That's why fast camera pans in movies don't look so great.
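Point 3 can be sketched with a quick calculation. The pan speed here (one full-HD screen width per second, 1920 px/s) is an assumed number for illustration only: the faster the motion, the bigger the jump an object makes between consecutive frames at a given frame rate.

```python
# Per-frame displacement of an object panning across the screen.
# speed_px_per_s is an assumption: one 1920-px screen width per second.
speed_px_per_s = 1920
for fps in (24, 60, 120):
    jump_px = speed_px_per_s / fps
    print(f"{fps} FPS -> object jumps {jump_px:.0f} px between frames")
# 24 FPS -> object jumps 80 px between frames
# 60 FPS -> object jumps 32 px between frames
# 120 FPS -> object jumps 16 px between frames
```

An 80-pixel jump every frame reads as judder unless motion blur smears it over, which is why fast pans at 24 FPS look rough.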

4 - Even for a non-interactive, motion-blurred movie with no super fast pans... go to 48 FPS and you will notice it a lot. In fact it will feel a little weird, not like a movie. It will feel more real.

I dunno about that, I took a chemistry course in school and one of the exam questions was to calculate the FPS human eyes can see at, then do the same for wolves. Humans do make a distinction at 24 FPS. We can see above 24 FPS, but the difference is marginal.

Below 24 FPS, the screen flickers. (For wolves and dogs it's 60 FPS; that's why they rarely watch TV, it's always flickering to them.)

Flickering is another and even worse problem. You would notice it so much at 24 FPS that any TV refreshes at least 50 or 60 times a second, and it's still visible. That's what 50Hz or 60Hz means on a television. On LCD it got better, but on CRT, for your eyes not to be tired by flickering, you needed 80 to 120Hz. I think video projectors run at only 48Hz with no visible flickering, but the room is dark. It really depends on the technology and conditions; anyway, there is no absolute "24 FPS" value.
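The 48Hz figure for projectors fits a well-known trick worth spelling out: film projectors show only 24 distinct frames per second, but the shutter flashes each frame two or three times, raising the flicker rate above what most people notice while keeping the image rate at 24. A quick sketch of the arithmetic:

```python
# Film projection flicker rate: 24 image frames per second, each frame
# flashed multiple times by the shutter to push flicker above the
# visible threshold. Image rate stays at 24 regardless of flashes.
frames_per_second = 24
for flashes_per_frame in (1, 2, 3):
    flicker_hz = frames_per_second * flashes_per_frame
    print(f"{flashes_per_frame} flash(es) per frame -> {flicker_hz} Hz flicker")
# 1 flash(es) per frame -> 24 Hz flicker
# 2 flash(es) per frame -> 48 Hz flicker
# 3 flash(es) per frame -> 72 Hz flicker
```

This is exactly why "how many FPS can the eye see" has no single answer: flicker sensitivity and motion smoothness are different thresholds, and the projector trick addresses only the first.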

I think what you got in that course was just an oversimplification, or even a mistake. http://en.wikipedia.org/wiki/Flicker_%28screen%29