Immersiveunreality said:
KiigelHeart said:

Ah the "30fps is more cinematic and 60fps is unnatural" argument. I guess it's still alive then lol. I think it's impossible not to see a major difference between the two. 

Yeah it's weird, because the higher the framerate, the more natural things look.

This is a nice quote:

''Myelinated nerves can fire between 300 and 1000 times per second in the human body and transmit information at 200 miles per hour. What matters here is how frequently these nerves can fire (or "send messages").

The nerves in your eye are not exempt from this limit. Your eyes can physiologically transmit data that quickly and your eyes/brain working together can interpret up to 1000 frames per second.

However, we know from experimenting (as well as simple anecdotal experience) that there is a diminishing return in what frames per second people are able to identify. Although the human eye and brain can interpret up to 1000 frames per second, someone sitting in a chair and actively guessing at how high a framerate is can, on average, interpret up to about 150 frames per second.

The point: 60 fps is not a 'waste'. 120 fps is not a 'waste' (provided you have a 120 Hz monitor capable of such display). There IS a very noticeable difference between 15 fps and 60 fps. Many will say there IS a noticeable difference between 40 and 60 fps. Lastly, the limit of the human eye is NOT as low as 30-60 fps. It's just not.

The origin of the myth: It probably has to do with the limitations of television and movies. Movies, when they were recorded on film reel, limited themselves to 24 frames per second for practical purposes. If there is a diminishing return in how many frames people can claim to actually notice, then the visual difference between 24 fps and 60 fps could not justify more than doubling the amount of film reel required to film a movie.

With the advent of easy digital storage, these limitations are now mostly arbitrary.

The numbers often cited as the mythological "maximum" the eye can see are 30 fps, 40 fps, and 60 fps.

I would guess the 60 fps "eye-seeing" limit comes from the fact that most PC monitors (and indeed many televisions now) have a maximum refresh rate of 60 Hz (or 60 frames per second). If a monitor has that 60 fps limit, the monitor is physically incapable of displaying more than 60 fps. This is one of the purposes of frame limiting, Vsync and adjusting refresh rate in video games.

tl;dr: The human eye can physiologically detect up to 1000 frames per second. The average human, tasked with detecting what framerate he/she is looking at, can accurately guess up to around 150 fps. That is, they can see the difference in framerates all the way to 150 fps.''
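On the frame limiting / Vsync point in that quote: a software frame cap basically just measures how long each frame took and sleeps off the rest of the per-frame budget. Here's a rough sketch of the idea in Python (illustrative only, the 60 fps target and the dummy loop are my assumptions, not any particular engine's code):

```python
import time

TARGET_FPS = 60                    # assumed cap, matching a 60 Hz display
FRAME_BUDGET = 1.0 / TARGET_FPS    # ~16.7 ms allowed per frame

def run_capped(num_frames=120):
    """Run a dummy game loop capped at TARGET_FPS by sleeping off leftover time."""
    for _ in range(num_frames):
        start = time.perf_counter()

        # ... update game state and render here ...

        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)   # don't outrun the display

if __name__ == "__main__":
    t0 = time.perf_counter()
    run_capped(120)
    print(f"120 frames took {time.perf_counter() - t0:.2f} s")  # roughly 2 s at 60 fps
```

Vsync does the same thing in effect, except the wait is tied to the display's actual refresh signal rather than a timer.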

''The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220 of a second, but also the ability to interpret higher FPS.''
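Just to put a number on that quote: a 1/220 s flash lasts exactly one frame of a display running at 220 fps, since frame time is just 1/fps. Quick check:

```python
# Sanity check on the 1/220 s figure from the quote: frame_time = 1 / fps.
flash_duration = 1 / 220                # seconds the aircraft image was shown
equivalent_fps = 1 / flash_duration     # display rate with that frame time

print(f"Flash duration: {flash_duration * 1000:.2f} ms")    # ~4.55 ms
print(f"Equivalent frame rate: {equivalent_fps:.0f} fps")   # 220 fps
```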

I think it's a matter of fitting the animation speed to the frame rate. Animations originally made for 30fps seem "too fast" at 60fps, so human movement looks less natural. It's not as bad all the time, but sometimes it shows. Kind of like watching a movie where the characters are moving at 1.25x or 1.5x speed.
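To show what I mean: if an animation advances a fixed step every frame (the way a lot of 30fps-era animation was tuned), doubling the frame rate doubles the playback speed. Scaling the step by the frame's delta time keeps the intended speed at any frame rate. A little Python sketch (the numbers are made up for illustration):

```python
# Why a 30fps-authored animation plays back "too fast" at 60fps when it
# advances a fixed amount of progress per frame, versus scaling by delta time.

ANIMATION_LENGTH = 2.0        # seconds the animation is meant to take
AUTHORED_FPS = 30             # frame rate the animation was tuned for
STEP_PER_FRAME = 1.0 / (ANIMATION_LENGTH * AUTHORED_FPS)   # fixed per-frame step

def fixed_step_duration(actual_fps):
    """Seconds the animation takes when it advances STEP_PER_FRAME every frame."""
    frames_needed = 1.0 / STEP_PER_FRAME    # frames until progress reaches 100%
    return frames_needed / actual_fps

def delta_time_duration(actual_fps):
    """Seconds the animation takes when progress is scaled by frame time."""
    # progress += (1.0 / ANIMATION_LENGTH) * dt, so duration is frame-rate independent
    return ANIMATION_LENGTH

for fps in (30, 60):
    print(f"{fps} fps: fixed-step = {fixed_step_duration(fps):.2f} s, "
          f"delta-time = {delta_time_duration(fps):.2f} s")
# 30 fps: fixed-step = 2.00 s, delta-time = 2.00 s
# 60 fps: fixed-step = 1.00 s, delta-time = 2.00 s
```

So when an older animation gets run at double the frame rate without that delta-time scaling, it really does look like the 1.25x or 1.5x playback effect I mentioned.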