Otter said: I'm sure the game can hit 60 when you stare at an empty sky.
In fairness, part of the new sensitivity to different frame rates comes from advances in TV technology, mostly low persistence. 30fps on a 120Hz low-persistence panel tends to look a lot more like a slideshow than 30fps on a CRT TV/monitor or an early 60Hz LCD TV. Before progressive scan, 60fps could actually look worse in fast motion than 30fps, with interlacing breaking the image apart.
Also, the higher the resolution and the bigger the display, the more visible the 'steps' become. On a small screen it doesn't matter much; on a 65" 4K screen you will see the 'steps' a lot more clearly. For 24fps, "the rule of thumb is to pan no faster than a full image width every seven seconds, otherwise judder will become too detrimental." That's a pretty slow turn rate! But necessary for cinema, where the screen is wall to wall.
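Just to put rough numbers on that rule of thumb (my own illustrative figures, assuming a 3840px-wide image; none of this is from the quote):

```python
# Rough sketch: how far the image shifts per frame during a constant-speed pan.
# screen_width_px and seconds_per_screen_width are assumed example values.
def pixels_per_frame(screen_width_px, seconds_per_screen_width, fps):
    """Horizontal shift per frame, in pixels, for a constant-speed pan."""
    pan_speed_px_per_s = screen_width_px / seconds_per_screen_width
    return pan_speed_px_per_s / fps

# 4K-wide image panning one full width every 7 seconds, shown at 24fps
print(pixels_per_frame(3840, 7, 24))   # ~22.9 px jump per frame
# The same slow pan at 60fps: smaller steps, but still far from smooth
print(pixels_per_frame(3840, 7, 60))   # ~9.1 px jump per frame
```

Even at the "safe" cinema pan rate, each frame still jumps by a couple of dozen pixels on a 4K image, which is exactly the stepping you notice on a big screen.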
I never had issues playing 30fps games on a 1080p projector; I actually still preferred it over 60fps. 60fps doesn't take the judder away: turn fast and you still get lots of 'double images'. The steps are smaller, but there are twice as many at 60fps vs 30fps. Ideally everything would only move 1 pixel between frames, but that means the effective frame rate of each object would depend on its speed across the screen. Maybe one day that will be possible, a Dolby Atmos equivalent for graphics. For now we're still stuck rendering one frame at a time (with some things actually running at half or quarter frame rate).
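To put that "1 pixel per frame" ideal in perspective, here's a quick sketch (again my own example numbers, e.g. a full-width turn in 2 seconds, not anything from the post):

```python
# Sketch of the "1 pixel per frame" ideal: the frame rate you'd need equals
# the object's speed across the screen in pixels per second.
def fps_for_one_pixel_steps(screen_width_px, seconds_per_screen_width):
    """Frame rate at which a constant-speed pan moves exactly 1 px per frame."""
    return screen_width_px / seconds_per_screen_width

# A fast in-game turn sweeping a 3840px-wide view in 2 seconds
print(fps_for_one_pixel_steps(3840, 2))   # 1920 fps for true 1 px steps
# The same turn at 30fps and 60fps jumps 64 px and 32 px per frame
print(3840 / 2 / 30, 3840 / 2 / 60)
```

So even doubling from 30 to 60 only halves the step size for a fast turn; getting to genuinely imperceptible steps is a different order of magnitude entirely.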