spemanig said:
Tachikoma said:

So long as 4K is being dangled before developers like the proverbial carrot, as the next big thing to slap on a game box before your competitors do, you can say goodbye to 60 fps, let alone 120.
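(Rough back-of-envelope sketch of why that trade-off exists, in Python; the resolutions and frame rates are just illustrative, and real rendering cost obviously doesn't scale purely with pixel count:)

```python
# Raw pixel throughput: 4K at 30 fps costs exactly as much fill as 1080p at 120 fps.
def pixels_per_second(width, height, fps):
    return width * height * fps

uhd_30  = pixels_per_second(3840, 2160, 30)    # 4K at 30 fps
fhd_120 = pixels_per_second(1920, 1080, 120)   # 1080p at 120 fps

print(f"4K @ 30 fps:     {uhd_30:,} px/s")     # 248,832,000
print(f"1080p @ 120 fps: {fhd_120:,} px/s")    # 248,832,000
```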

Personal note: those TVs in electronics stores playing a sports match to show off the set's 200+ Hz "fluid motion" crap always look jarringly weird to me. Even though it shouldn't really be possible, they feel like they're running smoother than real life; it's hard to explain.


I think it has something to do with the difference between the way cameras capture footage and the way our brains affect how we perceive the world. High framerate cameras, as far as I can tell, don't capture or add motion blur. As far as I'm aware, that's something our brains naturally add to fast-moving objects to help us comprehend them, but since high framerate video isn't displayed that way at all (it's just flashing lights), our brains can't do that motion-blur fill-in, which makes everything look weird. Basically, if we saw the world the way these cameras capture it, then when you waved your hand in front of your face, instead of seeing a big blurry mess of skin you'd see the very detailed, accurate movement of your hand rapidly moving back and forth, with substantially less or even nonexistent blur.
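(At least the camera half of this is easy to model: a frame is the average of everything that happened while the shutter was open, so a high-fps camera with its short exposure integrates less motion per frame. A toy sketch in Python, with a made-up scene and made-up numbers, just to show the shutter effect:)

```python
import numpy as np

def render_scene(t):
    """Hypothetical instantaneous image: a bright 1-px bar sweeping at 900 px/s."""
    img = np.zeros(100)
    img[int(t * 900) % 100] = 1.0
    return img

def capture_frame(t_open, shutter, samples=32):
    """A frame = average of many instants over the shutter interval (accumulated blur)."""
    instants = [render_scene(t_open + shutter * i / samples) for i in range(samples)]
    return np.mean(instants, axis=0)

slow = capture_frame(0.0, shutter=1 / 24)    # 24 fps-style exposure: wide smear
fast = capture_frame(0.0, shutter=1 / 240)   # 240 fps-style exposure: barely any
print("pixels smeared at 1/24 s exposure: ", np.count_nonzero(slow))   # ~32
print("pixels smeared at 1/240 s exposure:", np.count_nonzero(fast))   # ~4
```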

In a way, it sort of is smoother than in real life, because our brains can't cheat for us.

I think you've got that backwards. Any motion blur our eyes add, they can add just as well to moving objects on a screen. It's how far an object moves through your visual field while light is being collected on your retina that causes blur. You don't get motion blur on non-rotating objects that you follow with your eyes, which is probably why 120 Hz/240 Hz smooth motion looks weird: the motion blur is in the original footage and gets smeared out over multiple frames. Now it runs at 120 or even 240 fps, yet when you follow the ball with your eyes, it is still blurred as captured by the original 60 fps camera. Maybe that even interferes with your motion perception, which relies on clear lines and shapes to track movement. Instead of your brain seeing a blurry object 'step' through your visual field and inferring motion from that, now it can't see the steps any more and it becomes a blur smoothly gliding along.
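(You can see why the interpolation can't help with that in a tiny sketch. Real TVs do motion-compensated interpolation rather than the plain cross-fade below, but the point stands either way: every inserted frame is built from source pixels, baked-in blur included. The arrays are made-up 1-D "frames", purely illustrative:)

```python
import numpy as np

def interpolate(frame_a, frame_b, alpha):
    """Naive blend between two source frames (0 <= alpha <= 1)."""
    return (1 - alpha) * frame_a + alpha * frame_b

# Two consecutive 60 fps frames of a ball the camera already smeared:
blurry_a = np.array([0.0, 0.5, 1.0, 0.5, 0.0, 0.0])  # smear centred on px 2
blurry_b = np.array([0.0, 0.0, 0.5, 1.0, 0.5, 0.0])  # smear centred on px 3

# The inserted 120 fps frame is just another blurry image in between:
midpoint = interpolate(blurry_a, blurry_b, 0.5)
print(midpoint)  # [0.   0.25 0.75 0.75 0.25 0.  ] - still a smear, never sharp
```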

I don't think there's a difference between waving your hand in front of you and watching a high-fps recording of the same thing. Depending on the light source, of course: waving your hand in front of a CRT monitor was an easy test for flicker :)

It probably mostly has to do with adjusting to 120 fps recordings. From birth your brain has learned how to deal with 24, 30 and 60 fps. 120 fps is new, and apparently still not high enough to remove the distinction between looking out a window and looking at a 120 fps screen. Back before LCD, I used to game at up to 100 Hz (CRT doesn't really become flicker-free below 100), and I didn't notice anything weird about it. That was before motion blur was added to games.