
Forums - Gaming Discussion - When Will the Frame Rate Standard Go Beyond 60fps?

spemanig said:
Tachikoma said:

So long as 4K is being dangled before developers like the proverbial carrot, as the next big thing to slap on a game box before your competitors do, you can say goodbye to 60fps, let alone 120.

Personal note: those TVs they have in electronics stores with a sports match playing, showing off the set's 200+Hz "fluid motion" feature, always look jarringly weird to me. Even though it shouldn't really be possible, they feel like they're running smoother than real life; it's hard to explain.


I think it has something to do with the difference between the way cameras capture footage and the way our brains affect how we perceive the world. High-framerate cameras, as far as I can tell, don't capture or add motion blur. As far as I'm aware, that's something our brains naturally add to fast-moving objects to help us comprehend them better, but since that's not how high-framerate video is displayed at all (it's just flashing lights), our brains can't do that motion-blur fill-in, making everything look weird. Basically, if we saw the world the way these cameras capture it, then when you waved your hand in front of your face, instead of seeing a big blurry mess of skin, you'd see the very detailed and accurate movement of your hand rapidly moving back and forth with substantially less, or completely nonexistent, blur.

In a way, it sort of is smoother than in real life, because our brains can't cheat for us.
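The "blur is accumulated light" idea being debated here can be put in a toy sketch: treat each captured frame as the average of many instantaneous snapshots taken while the shutter is open. A slow shutter smears a moving dot across many pixels; a short, high-speed-camera shutter barely lets it move. All the numbers are made up for illustration, not taken from any real camera:

```python
# Toy model: motion blur as light accumulated over the shutter interval.
# A bright dot sweeps across a 1-D strip of pixels; the camera averages
# all sub-positions visited while the shutter is open. A long exposure
# (24fps-style) smears the dot; a short one (high-speed) keeps it sharp.

def capture_frame(width, start, speed, exposure, samples=100):
    """Average `samples` instantaneous images of a moving dot."""
    strip = [0.0] * width
    for i in range(samples):
        t = exposure * i / samples
        pos = int(start + speed * t) % width
        strip[pos] += 1.0 / samples
    return strip

long_exp  = capture_frame(width=20, start=0, speed=240, exposure=1/24)   # dot travels ~10 px
short_exp = capture_frame(width=20, start=0, speed=240, exposure=1/240)  # dot travels ~1 px

blur_long  = sum(1 for v in long_exp if v > 0)   # pixels the dot smeared across
blur_short = sum(1 for v in short_exp if v > 0)
print(blur_long, blur_short)  # the long exposure lights up ten times as many pixels
```

The same dot at the same speed covers ten pixels in one long-exposure frame but stays within one pixel per high-speed frame, which is why high-framerate footage looks unnaturally crisp in motion.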

Oh! That explains why I like 60fps in video games (where they can add motion blur) but it makes me motion sick with real-life footage. I always get a really weird uncanny-valley feeling when I watch streamers who stream in 60fps, and I didn't like the 48fps of The Hobbit much either. Although I think I'm slowly getting used to it.




I'm sorry, but I have to say it's useless… I really don't understand your excitement for this.



I don't know, maybe 2020 ? 



SuperNova said:
spemanig said:
Tachikoma said:

So long as 4K is being dangled before developers like the proverbial carrot, as the next big thing to slap on a game box before your competitors do, you can say goodbye to 60fps, let alone 120.

Personal note: those TVs they have in electronics stores with a sports match playing, showing off the set's 200+Hz "fluid motion" feature, always look jarringly weird to me. Even though it shouldn't really be possible, they feel like they're running smoother than real life; it's hard to explain.


I think it has something to do with the difference between the way cameras capture footage and the way our brains affect how we perceive the world. High-framerate cameras, as far as I can tell, don't capture or add motion blur. As far as I'm aware, that's something our brains naturally add to fast-moving objects to help us comprehend them better, but since that's not how high-framerate video is displayed at all (it's just flashing lights), our brains can't do that motion-blur fill-in, making everything look weird. Basically, if we saw the world the way these cameras capture it, then when you waved your hand in front of your face, instead of seeing a big blurry mess of skin, you'd see the very detailed and accurate movement of your hand rapidly moving back and forth with substantially less, or completely nonexistent, blur.

In a way, it sort of is smoother than in real life, because our brains can't cheat for us.

Oh! That explains why I like 60fps in video games (where they can add motion blur) but it makes me motion sick with real-life footage. I always get a really weird uncanny-valley feeling when I watch streamers who stream in 60fps, and I didn't like the 48fps of The Hobbit much either. Although I think I'm slowly getting used to it.

Do you have that problem with TV too? 1080i broadcast has been used alongside 720p, and some shows are captured at 60i, though most are 30 or even 24fps.
How about games before motion blur? A lot ran at 60fps, and on PC nearly all did. Motion blur is only a recent thing.

True, on a monitor it looks more like a game than on a TV.
https://www.youtube.com/watch?v=GJMYJzaKCq4
Doctor Who does look more fake than the usual 30fps show. Lighting requirements are different at 60fps, and since it looks more 'real', props and make-up have to step up too. There are also fewer gaps for your brain to fill in with detail. The original Star Trek series looked pretty real on a crappy black-and-white analog TV; the enhanced 1080p version is still charming, yet it looks like a high school play.

Some motion sickness is also due to how the camera works
https://www.youtube.com/watch?v=0wi5aNTHUVk
Especially at the start you can see that the camera's capture is too slow for 60fps and the picture constantly deforms. That's a problem with digital cameras that don't take an instant picture but instead continually scan from top to bottom (or bottom to top). When you snap a picture out of a fast-moving car with your phone, the picture usually seems to bend in the direction you're travelling.
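That top-to-bottom scan is usually called rolling shutter, and the "bend" it produces is easy to sketch: each row of the sensor is read out slightly later, so a vertical edge moving sideways lands at a different x position on every row. The row count and speed below are arbitrary illustration values:

```python
# Toy rolling-shutter model: the sensor reads rows top to bottom over the
# whole frame time, so a vertical line moving sideways lands at a later
# x position on each successive row and comes out slanted, not straight.

def rolling_shutter_positions(rows, frame_time, speed, x0=0.0):
    """x position of a moving vertical line at the moment each row is read."""
    return [x0 + speed * frame_time * (r / rows) for r in range(rows)]

# Line moving at 600 px/s, captured with a ~16.7 ms (60fps) readout:
xs = rolling_shutter_positions(rows=10, frame_time=1/60, speed=600)
skew = xs[-1] - xs[0]   # horizontal displacement between top and bottom rows
print(round(skew, 2))   # the slant you see snapping photos from a moving car
```

A global-shutter camera would read every row at the same instant, making `skew` zero; the faster the subject (or the slower the readout), the bigger the slant.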
Plus, giving your eyes better motion cues at 60fps is of course the first cause of motion sickness: the conflict between your inner ear and your visual cues only becomes stronger. Yet at 2:42 the camera is stationary; is that still uncomfortable to watch?

Here's another comparison
https://www.youtube.com/watch?v=ChsT-y7Yvkk
The Avatar footage looks better than ever, imo. It looked fake anyway, so it might as well look smooth and fake :)



aLkaLiNE said:

Not entirely sure what you guys are talking about; 120Hz smart TVs have been a thing for years, and we even have a select few that run at 240Hz. That doesn't even take into consideration plasma TVs, which operate at an insane 500Hz.

Personally, I would not buy a modern-day TV if it only had a 60Hz rating. That's something I consider a big deal, and it directly impacts the lag between what the console outputs and what you're actually seeing.

120Hz TVs are at best 20% of the TVs sold currently. Game devs are not going to make the jump past 60fps until more people have 120Hz TVs than 60Hz ones (which is going to be a LONG time).




The reality is, there comes a point where the human eye doesn't really notice a faster frame rate that much. Sure, something competitive or super sensitive to quick action and response is one thing, but for the average game it just isn't that important.

It's sort of like why people really enjoy films at 24 frames per second, and have for a veryyyyy long time. There is only SO much detail the human eye will really notice at one time anyway.

 

I'm not against updating it across the board, but if it's expensive, I just don't think an insanely high FPS is that important for casual gaming, when half the time gamers are doing something else or eating haha

At this point it's irrational anyway, as the vast majority of PC gamers don't have monitors that support frame rates higher than 60fps.



rolltide101x said:
aLkaLiNE said:

Not entirely sure what you guys are talking about; 120Hz smart TVs have been a thing for years, and we even have a select few that run at 240Hz. That doesn't even take into consideration plasma TVs, which operate at an insane 500Hz.

Personally, I would not buy a modern-day TV if it only had a 60Hz rating. That's something I consider a big deal, and it directly impacts the lag between what the console outputs and what you're actually seeing.

120Hz TVs are at best 20% of the TVs sold currently. Game devs are not going to make the jump past 60fps until more people have 120Hz TVs than 60Hz ones (which is going to be a LONG time).


I think you're overestimating here. For instance, 4K became a thing what, two years ago? The launch prices of such TVs were astronomical, north of five grand. Today I can walk into a store and buy a smart 4K 48" TV for about $800.

 

With refresh rates, 120Hz has been available on HDTVs for at least five years, and slowly but surely 60Hz has been phased out over time. Now, instead of 120Hz being the high-end number, it's 240Hz, and I'm not positive about this, but I believe I've seen super-premium models sporting 480Hz (OLED, not plasma). With that being said, it's only a matter of time before they stop making 60Hz TVs altogether.

 

And what I meant in my first comment about input lag wasn't completely accurate. Simply put, if you have a smart TV at 60Hz, yes, it is refreshing the screen 60 times per second. However, most if not all smart TVs have display engines built into the software that can, for instance, make colors brighter or contrast better, and that processing takes a certain amount of time before the picture is displayed on the screen. The input lag I'm talking about is really a processing delay, which is why on newer sets you can feel a difference when turning on 'game' mode: it bypasses the software engine to get rid of that delay. By investing in a TV with a higher refresh rate, more often than not the benefit of 'game' mode diminishes greatly, and you can keep all those fancy filters processing while observing much less delay. You want the weakest point in the setup to be the console: not the sound system, not the display, but the console. That way, no matter how fast your console or PC performs, everything else has no problem keeping up with it. Does that make sense? I'm writing this from a cell phone lol.
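The processing-delay point above is easy to put into rough numbers: total lag is roughly one refresh interval plus however long the TV's picture engine spends on the frame. The processing times below are purely illustrative, not measurements of any real set:

```python
# Rough arithmetic for display lag: one refresh interval plus the TV's
# picture-processing time. The processing figures are illustrative only.

def total_lag_ms(refresh_hz, processing_ms):
    frame_ms = 1000 / refresh_hz   # time for one refresh, in milliseconds
    return frame_ms + processing_ms

normal = total_lag_ms(60, 100)   # heavy picture processing enabled
game   = total_lag_ms(60, 10)    # 'game' mode bypasses most of it
print(round(normal, 1), round(game, 1))  # e.g. ~117 ms vs ~27 ms
```

Under these assumed numbers, the picture engine, not the 16.7 ms refresh itself, dominates the delay, which matches the poster's point that 'game' mode matters mainly because it skips that processing.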

 

 

 

Edit: I totally went off on a tangent and described something you weren't talking about :P The point is that 60Hz TVs are being phased out, and as 4K is the next big thing, I'm willing to bet that refresh rates on the TV side of things will bump up along with it.



SvennoJ said:
spemanig said:
Tachikoma said:

So long as 4K is being dangled before developers like the proverbial carrot, as the next big thing to slap on a game box before your competitors do, you can say goodbye to 60fps, let alone 120.

Personal note: those TVs they have in electronics stores with a sports match playing, showing off the set's 200+Hz "fluid motion" feature, always look jarringly weird to me. Even though it shouldn't really be possible, they feel like they're running smoother than real life; it's hard to explain.


I think it has something to do with the difference between the way cameras capture footage and the way our brains affect how we perceive the world. High-framerate cameras, as far as I can tell, don't capture or add motion blur. As far as I'm aware, that's something our brains naturally add to fast-moving objects to help us comprehend them better, but since that's not how high-framerate video is displayed at all (it's just flashing lights), our brains can't do that motion-blur fill-in, making everything look weird. Basically, if we saw the world the way these cameras capture it, then when you waved your hand in front of your face, instead of seeing a big blurry mess of skin, you'd see the very detailed and accurate movement of your hand rapidly moving back and forth with substantially less, or completely nonexistent, blur.

In a way, it sort of is smoother than in real life, because our brains can't cheat for us.

I think you've got that backwards. Any motion blur our eyes add, they can add just as well to moving objects on a screen. It's how far an object moves through your visual field while the light is being collected on your retina that causes blur; you don't get motion blur on non-rotating objects that you track with your eyes. Which is probably why 120Hz/240Hz smooth motion looks weird: the motion blur is in the original footage and gets smeared out over multiple frames. Now it runs at 120 or even 240fps, yet when you follow the ball with your eyes it is still blurred as captured by the original 60fps camera. Maybe that even interferes with your motion perception, which relies on clear lines and shapes to detect movement. Instead of your brain seeing a blurry object 'step' through your visual field and inferring motion from that, it can no longer see the steps, and the object becomes a blur smoothly moving along.

I don't think there's a difference between waving your hand in front of you and watching a high-fps recording of waving your hand in front of you. Depending on the light source, of course; waving your hand in front of a CRT monitor was an easy test for flicker :)

It probably mostly has to do with adjusting to 120fps recordings. From birth your brain has learned how to deal with 24, 30 and 60fps; 120fps is new, and apparently still not high enough to remove the distinction between looking out a window and looking at a 120fps screen. Back before LCD I used to game at up to 100Hz (a CRT doesn't really become flicker-free below 100), and I didn't notice anything weird about it. That was before motion blur was added to games.


Ah. Thank you for clarifying!



Is there a need for a frame rate that goes beyond 60fps? That's what I want to know.



                
       ---Member of the official Squeezol Fanclub---

When will VR become the standard? No time in the foreseeable future.