
Forums - Gaming Discussion - When Will the Frame Rate Standard Go Beyond 60fps?

AZWification said:

Is there a need for a frame rate that goes beyond 60fps? That's what I want to know.

Depends how fast you want to pan the image.
https://www.youtube.com/watch?v=ChsT-y7Yvkk
Check at 0:16: 60fps is not fast enough to keep the background stable. It still looks like multiple images instead of a moving background you can lock your eyes onto and follow. The real upper limit is when the image only shifts 1 pixel per frame. Only then do you get a perfect image when you follow something across the screen with your eyes. How much you need that is debatable, yet that's the only way to make it indistinguishable from real life (granted, the resolution would also have to be higher than your eyes can resolve).
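
To put a rough number on that "1 pixel per frame" idea, here is a quick back-of-the-envelope calculation. The screen width and pan duration below are just assumed example values, not anything taken from the video.

    # Rough estimate of the frame rate needed so a full-screen horizontal pan
    # moves the image by only ~1 pixel per frame.
    # The screen width and pan duration are assumed example values.
    screen_width_px = 1920       # horizontal resolution
    pan_duration_s = 4.0         # assumed time to pan across the full screen width

    pixels_per_second = screen_width_px / pan_duration_s   # 480 px/s
    fps_for_1px_per_frame = pixels_per_second               # 1 px/frame target -> 480 fps
    shift_at_60fps = pixels_per_second / 60                 # px moved per frame at 60fps

    print(f"fps needed for 1 px/frame: {fps_for_1px_per_frame:.0f}")
    print(f"pixel shift per frame at 60fps: {shift_at_60fps:.0f} px")

At that assumed pan speed, 60fps still leaves the image jumping about 8 pixels every frame, which is why the background reads as multiple copies instead of one moving picture you can track.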



Tachikoma said:

So long as 4K is being dangled before developers like the proverbial carrot, and is the next big thing to slap on a game box before your competitors do, you can say goodbye to 60fps, let alone 120.

Personal note: those TVs they have in electronics stores with a sports match playing, showing off the TV's 200+Hz "fluid motion" crap - I always find those jarringly weird. Even though it isn't really possible, they feel like they're running smoother than real life; it's hard to explain.


I dare say most developers with the capacity to produce top-class graphics are smart enough not to strive for 4K. The compromises simply won't be worth it from a graphical or performance standpoint. As far as marketing is concerned, the game that looks like a CG movie will get more traction than the game that looks slightly better than a PS4 title but runs at 4K. I think devs will stop at 60fps, unless it's necessary to jump higher (VR titles) or until we reach the point where they have little else to throw power at (a very distant future).
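
For a feel of the trade-off being described, here is the raw pixel-throughput arithmetic; it is not tied to any particular console or engine, just the numbers themselves.

    # Pixels per second a renderer has to produce at different resolution/frame-rate
    # targets. Pure arithmetic, not tied to any particular console or GPU.
    modes = {
        "1080p @ 30fps": (1920, 1080, 30),
        "1080p @ 60fps": (1920, 1080, 60),
        "4K @ 30fps":    (3840, 2160, 30),
        "4K @ 60fps":    (3840, 2160, 60),
    }

    for name, (w, h, fps) in modes.items():
        print(f"{name}: {w * h * fps / 1e6:.0f} million pixels per second")

4K at 30fps already needs roughly twice the raw pixel throughput of 1080p at 60fps, which is exactly the compromise being weighed above.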



aLkaLiNE said:


I think you're overestimating here. For instance, 4K became a thing, what, like two years ago? The launch prices of such TVs were astronomical, north of 5 grand. Today I can walk into a store and buy a smart 4K 48" TV for about $800.

 

With refresh rates, 120Hz has been available on HDTVs for at least five years, and slowly but surely 60Hz has been getting phased out over time. Now instead of 120Hz being available on high-end TVs, that number is 240Hz, and I'm not positive about this, but I believe I've seen super premium models sporting 480Hz (that were OLED, not plasma). With that being said, it's only a matter of time before they stop making 60Hz TVs altogether.

 

And what I meant in my first comment about input lag was not completely accurate. Simply put, if you have a smart TV at, say, 60Hz, yes, it is refreshing the screen at 60fps. However, most if not all smart TVs have display engines built into the software that, for instance, can make colors brighter, contrast better, etc. - this takes a certain amount of time to process before the image is displayed on the screen. The input lag I'm talking about is more of a processing delay, which is why on newer sets you can feel a difference when turning on 'gaming' mode: it bypasses the software engine to get rid of this delay. By investing in TVs that have a higher refresh rate, more often than not the effect of 'gaming' mode diminishes greatly and you can keep all those fancy filters processing while observing much less delay. You want the weakest point in the setup to be the console - not the sound system, not the monitor, but the console. This way, no matter how fast your console or PC can perform, everything else has no problem keeping up with it. Does that make sense? I'm writing this from a cell phone lol.
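
A rough sketch of the kind of numbers involved; the millisecond figures below are assumed ballpark values purely for illustration, not measurements of any particular set.

    # Very rough latency budget for a console-to-TV chain.
    # All millisecond figures are assumed ballpark values, not measurements.
    frame_time_60hz_ms = 1000 / 60     # ~16.7 ms per frame at 60Hz
    tv_filters_on_ms   = 80            # assumed processing delay with picture filters active
    tv_game_mode_ms    = 20            # assumed delay with processing bypassed ('gaming' mode)

    print(f"frame time at 60Hz: {frame_time_60hz_ms:.1f} ms")
    print(f"extra display delay with filters on: ~{tv_filters_on_ms} ms (several frames)")
    print(f"extra display delay in gaming mode:  ~{tv_game_mode_ms} ms (about one frame)")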

 

 

 

Edit - I totally went off on a tangent and described something you weren't talking about :P The point is that 60Hz TVs are being phased out, and as 4K is the next big thing, I'm willing to bet that refresh rates on the TV side of things will bump up along with it.

TV and PC refresh rates usually cannot be compared directly. TVs still very much operate at 60Hz even though they're marketed as 100/120/240Hz or whatever... meaning most TVs accept only a 60Hz source. There are very few true 120Hz TVs out there. With PC monitors it's different: if it is a 120Hz monitor, it accepts a 120Hz source. In the context of this thread, your talk is very misleading.



I cannot imagine toilet-free life.

Kebabs have a unique attribute compared to other consumables. To unlock this effect you need to wolf down a big ass kebab really fast, like under 10 minutes or so and wait for the effect to kick in. If done correctly your movements should feel unbelievably heavy to the point where you literally cannot move at all.

-Downtown Alanya Kebab magazine issue no.198

WC4Life said:
aLkaLiNE said:


I think you're overestimating here. For instance, 4K became a thing, what, like two years ago? The launch prices of such TVs were astronomical, north of 5 grand. Today I can walk into a store and buy a smart 4K 48" TV for about $800.

 

With refresh rates, 120Hz has been available on HDTVs for at least five years, and slowly but surely 60Hz has been getting phased out over time. Now instead of 120Hz being available on high-end TVs, that number is 240Hz, and I'm not positive about this, but I believe I've seen super premium models sporting 480Hz (that were OLED, not plasma). With that being said, it's only a matter of time before they stop making 60Hz TVs altogether.

 

And what I meant in my first comment about input lag was not completely accurate. Simply put, if you have a smart TV at, say, 60Hz, yes, it is refreshing the screen at 60fps. However, most if not all smart TVs have display engines built into the software that, for instance, can make colors brighter, contrast better, etc. - this takes a certain amount of time to process before the image is displayed on the screen. The input lag I'm talking about is more of a processing delay, which is why on newer sets you can feel a difference when turning on 'gaming' mode: it bypasses the software engine to get rid of this delay. By investing in TVs that have a higher refresh rate, more often than not the effect of 'gaming' mode diminishes greatly and you can keep all those fancy filters processing while observing much less delay. You want the weakest point in the setup to be the console - not the sound system, not the monitor, but the console. This way, no matter how fast your console or PC can perform, everything else has no problem keeping up with it. Does that make sense? I'm writing this from a cell phone lol.

 

 

 

Edit - I totally went off on a tangent and described something you weren't talking about :P The point is that 60Hz TVs are being phased out, and as 4K is the next big thing, I'm willing to bet that refresh rates on the TV side of things will bump up along with it.

TV and PC refresh rates usually cannot be compared directly. TVs still very much operate at 60Hz even though they're marketed as 100/120/240Hz or whatever... meaning most TVs accept only a 60Hz source. There are very few true 120Hz TVs out there. With PC monitors it's different: if it is a 120Hz monitor, it accepts a 120Hz source. In the context of this thread, your talk is very misleading.

Misleading how? And if they can't usually be compared directly, then under what circumstances can they be compared directly?

 

The only part that's misleading, as far as HDTV manufacturers go, is that giant "Clearmotion240" badge you see slapped on the corner of the box. Or "Motionflow XR480", and so on - each of the HDTV makers has its own way of labelling its smart TVs, but they always label them in multiples of 60. That's misleading because someone will see that 'clearmotion240' badge and think it's talking about the refresh rate, when the truth is that the refresh rate is a fraction of that (usually half), and the company in question is saying that their specific technology makes the picture appear as fluid or vivid or realistic as that number.

 

So let's say you have a Sony smart TV that says 'Motionflow XR240' on it. The refresh rate will actually be 120Hz. Same with Samsung, same with LG, same with every company in the smart TV game. Does your TV say 'clearmotion120' on the box? That means it has a 60Hz refresh rate. The refresh rates themselves will still be stated on the box, but you might have to go out of your way to find them. These companies are misleading, but what I have posted definitely is not.
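
If the "badge number is roughly double the native refresh" pattern described above holds - it varies by manufacturer, so treat it as a heuristic rather than a spec - the mapping is trivial:

    # Heuristic only: many 'motion' badges quote roughly double the panel's native
    # refresh rate. Always check the actual spec sheet for the real number.
    def estimated_native_refresh(badge_number_hz: int) -> int:
        return badge_number_hz // 2

    for badge in (120, 240, 480):
        print(f"badge {badge} -> likely native refresh ~{estimated_native_refresh(badge)}Hz")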



aLkaLiNE said:
WC4Life said:

TV and PC refresh rates usually cannot be compared directly. TVs still very much operate at 60Hz even though they're marketed as 100/120/240Hz or whatever... meaning most TVs accept only a 60Hz source. There are very few true 120Hz TVs out there. With PC monitors it's different: if it is a 120Hz monitor, it accepts a 120Hz source. In the context of this thread, your talk is very misleading.

Misleading how? And if they can't usually be compared directly, then under what circumstances can they be compared directly?

 

The only part that's misleading, as far as HDTV manufacturers go, is that giant "Clearmotion240" badge you see slapped on the corner of the box. Or "Motionflow XR480", and so on - each of the HDTV makers has its own way of labelling its smart TVs, but they always label them in multiples of 60. That's misleading because someone will see that 'clearmotion240' badge and think it's talking about the refresh rate, when the truth is that the refresh rate is a fraction of that (usually half), and the company in question is saying that their specific technology makes the picture appear as fluid or vivid or realistic as that number.

 

So let's say you have a Sony smart TV that says 'Motionflow XR240' on it. The refresh rate will actually be 120Hz. Same with Samsung, same with LG, same with every company in the smart TV game. Does your TV say 'clearmotion120' on the box? That means it has a 60Hz refresh rate. The refresh rates themselves will still be stated on the box, but you might have to go out of your way to find them. These companies are misleading, but what I have posted definitely is not.


Going by the odds, TVs and monitors are never directly comparable, because they are marketed differently. Of course, hard specs can be compared, but that is not what you get with a quick look. You seem to be under the impression that it is a rule that a TV's refresh rate is half of what is being marketed, and this is simply not true.

Anyway, what matters here is the context. Yes, these TVs refresh their picture at 120Hz, but you need to understand that those TVs take a 60Hz source signal and internally interpolate the other 60Hz, which combined makes the 120Hz refresh rate. If developers made games run at 120fps, the majority of TVs would not be able to display them at a true 120Hz at all (and that includes TVs sold in 2015), which would make the higher framerate a complete waste. Also, the plasma "400Hz" refresh rate is bullshit; manufacturers wanted a bigger number, but it is not actually comparable.

 

In short, the large majority of TVs are not capable of displaying a framerate higher than 60fps for games.
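
A crude way to picture what the TV is doing with that 60Hz source: it manufactures the in-between frames itself. Real sets use motion-compensated interpolation; the sketch below just blends neighbouring frames, which is the simplest possible stand-in for the idea.

    import numpy as np

    def insert_blended_frames(frames_60hz):
        # Naive stand-in for TV motion interpolation: between every pair of real
        # 60Hz frames, insert one frame that is their average, producing a 120Hz
        # sequence. Real sets use motion-compensated interpolation, not blending.
        output = []
        for prev_frame, next_frame in zip(frames_60hz, frames_60hz[1:]):
            output.append(prev_frame)
            blended = (prev_frame.astype(np.float32) + next_frame) / 2
            output.append(blended.astype(prev_frame.dtype))
        output.append(frames_60hz[-1])
        return output

    # Example: three dummy 2x2 grayscale frames in, five frames out.
    source = [np.full((2, 2), value, dtype=np.uint8) for value in (0, 100, 200)]
    print(len(insert_blended_frames(source)))  # 5

The key point stands either way: the game still only delivers 60 distinct frames per second; everything above that is made up by the TV.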



I cannot imagine toilet-free life.

Kebabs have a unique attribute compared to other consumables. To unlock this effect you need to wolf down a big ass kebab really fast, like under 10 minutes or so and wait for the effect to kick in. If done correctly your movements should feel unbelievably heavy to the point where you literally cannot move at all.

-Downtown Alanya Kebab magazine issue no.198

SvennoJ said:
SuperNova said:
spemanig said:
Tachikoma said:

So long as 4K is being dangled before developers like the proverbial carrot, and is the next big thing to slap on a game box before your competitors do, you can say goodbye to 60fps, let alone 120.

Personal note: those TVs they have in electronics stores with a sports match playing, showing off the TV's 200+Hz "fluid motion" crap - I always find those jarringly weird. Even though it isn't really possible, they feel like they're running smoother than real life; it's hard to explain.


I think it has something to do with the difference between the way cameras capture footage and the way our brains affect how we perceive the world. High framerate cameras, as far as I can tell, don't capture or add motion blur. As far as I am aware, that's something our brains naturally add to fast moving objects in the world to help us comprehend them better, but since that's not at all the way high framerate video is displayed - it's just flashing lights - our brains can't do that motion blur fill-in, making everything look weird. Basically, if we saw the world the way these cameras capture it, when you waved your hand in front of your face, instead of seeing a big blurry mess of skin, you'd see the very detailed and accurate movement of your hand rapidly moving back and forth with substantially less to completely nonexistent blur.

In a way, it sort of is smoother than in real life, because our brains can't cheat for us.

Oh! That explains why I like 60fps in video games (where they can add motion blur) but it makes me motion sick with real-life footage. I always get a really weird uncanny-valley feeling when I watch streamers who stream at 60fps, and I didn't like the 48fps of The Hobbit much either. Although I think I'm slowly getting used to it.

Do you have that problem with TV too? 1080i broadcast has been used alongside 720p, and some shows are captured at 60i, though most are 30 or even 24fps.
How about games before motion blur? A lot ran at 60fps, and on PC nearly all did. Motion blur is a fairly recent thing.

True, on a monitor it looks more like a game than on TV.
https://www.youtube.com/watch?v=GJMYJzaKCq4
Dr Who does look more fake than the usual 30fps show. Lighting requirements are different at 60fps, and since it looks more 'real', props and make-up have to step up too. There are also fewer gaps for your brain to fill in with detail. The original Star Trek series looked pretty real on a crappy b&w analog TV; the enhanced 1080p version is still charming, yet it looks like a high school play.

Some motion sickness is also due to how the camera works
https://www.youtube.com/watch?v=0wi5aNTHUVk
Especially at the start you can see that the camera capture is too slow for 60fps and the picture constantly deforms. That's a problem with digital cameras that don't take an instant picture, but continually scan from top to bottom (or bottom to top). When you snap a picture out of a fast moving car with your phone, the picture usually seems to bend in the direction you're travelling.
Plus, giving your eyes better motion cues at 60fps is of course the first cause of motion sickness: the conflict between your inner ear and the visual cues only becomes stronger. Yet at 2:42 the camera is stationary - is that still uncomfortable to watch?
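
That top-to-bottom scan is the rolling shutter effect. Here is a minimal toy model of why it bends the picture, assuming each sensor row is read slightly later than the one above it while the scene moves sideways; the shift value is just an illustrative number.

    import numpy as np

    def rolling_shutter_skew(frame, shift_px_per_row):
        # Toy rolling shutter model: each sensor row is read slightly later than
        # the one above it, so a scene moving sideways gets shifted a bit more on
        # every lower row, bending vertical edges into diagonals.
        skewed = np.zeros_like(frame)
        for row in range(frame.shape[0]):
            skewed[row] = np.roll(frame[row], int(row * shift_px_per_row))
        return skewed

    # A vertical white bar on a black background comes out as a diagonal.
    frame = np.zeros((8, 8), dtype=np.uint8)
    frame[:, 3] = 255
    print(rolling_shutter_skew(frame, 0.5))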

Here's another comparison
https://www.youtube.com/watch?v=ChsT-y7Yvkk
The Avatar footage looks better than ever imo. Looked fake anyway, might as well look smooth and fake :)


Thanks for your reply. It's an awesome post!

As for your questions, I don't tend to get it as much with normal TV broadcasts. Sometimes I get a little motion sick, but it happens very rarely. I've never had it watching Dr. Who.

With 60fps games before motion blur, I'm not too sure. I've never noticed any trouble, so I guess it might be more than just the motion blur. Also, with games a lot depends on how the camera moves. One Allison Road demo video made me miserably motion sick, for example.

Hmm. So basically I become motion sick because 60fps looks too close to reality and my brain can't deal with the dissonance.

I do get motion sick with 60fps on a stationary camera too though, so it's not just limited to fast camera movement or fast, action-heavy scenes, which is weird I guess.

Avatar looks like a really beautiful game already, so I'm not as bothered.;P

Also, I saw your reply to spemanig after posting mine, so thanks for clarifying!



SuperNova said:


Thanks for your reply. It's an awesome post!

As for your questions, I don't tend to get it as much with normal TV broadcasts. Sometimes I get a little motion sick, but it happens very rarely. I've never had it watching Dr. Who.

With 60fps games before motion blur, I'm not too sure. I've never noticed any trouble, so I guess it might be more than just the motion blur. Also, with games a lot depends on how the camera moves. One Allison Road demo video made me miserably motion sick, for example.

Hmm. So basically I become motion sick because 60fps looks too close to reality and my brain can't deal with the dissonance.

I do get motion sick with 60fps on a stationary camera too though, so it's not just limited to fast camera movement or fast, action-heavy scenes, which is weird I guess.

Avatar looks like a really beautiful game already, so I'm not as bothered.;P

Also, I saw your reply to spemanig after posting mine, so thanks for clarifying!

Yep, ironically HFR, meant to make movies look closer to reality, does the exact opposite. A lot of movies use shaky camera footage to make them feel more 'real'. It's not just the camera movement that does this; by making the footage harder to follow, your brain does more work to fill in the gaps. A lot of action scenes would look quite tame with a stationary camera. Filming them at HFR makes them easier to follow, causing them to look more tame, as well as giving a lot more people motion sickness.
It's the same reason the most effective horror movies and games are those that show the least.

It depends on the movie, of course. Animated movies at 60fps look great; there, adding the realness factor of HFR does work. Of course, it already takes months to render an animated movie at 24fps, so going to 60fps is quite costly.

Here is an awesome post about the 'realness' factor of higher frame rates.
http://accidentalscientist.com/2014/12/why-movies-look-weird-at-48fps-and-games-are-better-at-60fps-and-the-uncanny-valley.html
It goes into detail about how our eyes work. Basically, the human eye uses microtremor for edge detection, and thus motion detection. Your eyes continuously wobble at about 83Hz to sweep the light across the receptor cells (about 3 of them on average).

Let’s assume that if (like real life) what you’re seeing is continuously changing, and noisy, your brain can pick out the sparse signal from the data very effectively. It can supersample (as we talked about above), and derive twice the data from it. In fact, the signal has to be noisy for the best results – we know that from a phenomenon known as Stochastic Resonance.

What’s more, if we accept that an oscillation of 83.68Hz allows us to perceive double the resolution, what happens if you show someone pictures that vary (like a movie, or a videogame) at less than half the rate of the oscillation?

We’re no longer receiving a signal that changes fast enough to allow the super-sampling operation to happen. So we’re throwing away a lot of perceived-motion data, and a lot of detail as well.

If it’s updating higher than half the rate of oscillation? As the eye wobbles around, it’ll sample more details, and can use that information to build up a better picture of the world. Even better if we’ve got a bit of film-grain noise in there (preferably via temporal anti-aliasing) to fill in the gaps.

It just so happens that half of 83.68Hz is about 41Hz. So if you’re going to have high-resolution pulled properly out of an image, that image needs to be noisy (like film-grain) and update at > 41Hz. Like, say, The Hobbit. Or any twitch-shooter.

Less than that? Say, 24fps? Or 30fps for a game? You’re below the limit. Your eye will sample the same image twice, and won’t be able to pull out any extra spatial information from the oscillation. Everything will appear a little dreamier, and lower resolution. (Or at least, you’ll be limited to the resolution of the media that is displaying the image, rather than some theoretical stochastic limit).

He also explains the usefulness of film grain in games and why 60fps can be more useful than doubling resolution for 3D games. Worth a read.
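
The arithmetic behind that ~41Hz figure is just the sampling argument from the article: the display has to update faster than half of the eye's ~83.68Hz tremor for the extra detail to be recoverable.

    # Sampling-style argument from the linked article: the display must update
    # faster than half the eye's ~83.68Hz microtremor for the 'supersampling'
    # effect to work.
    eye_tremor_hz = 83.68
    threshold_hz = eye_tremor_hz / 2      # ~41.8 Hz

    for rate in (24, 30, 48, 60, 120):
        verdict = "above" if rate > threshold_hz else "below"
        print(f"{rate}fps is {verdict} the ~{threshold_hz:.0f}Hz threshold")

Which is why 24fps film and 30fps games fall on one side of the line, while 48fps (The Hobbit) and 60fps games fall on the other.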




For consoles, that will have to wait for VR (pretty soon for early adopters)... 60fps is the cap for them because that is how the standard for TV refresh rates is set.

 

On the PC side there have been high refresh rate monitors for ages. LCD screens for some reason capped at 60Hz early on, and until now it has been fairly rare for people to have monitors with higher refresh rates (yes, there are 120, 144 and 165Hz monitors... but I have never seen one yet... I'd love to!).

 

Anyway, back in the CRT days I used to run everything at 85 to 100Hz depending on the resolution... Lower refresh rates flickered and it wasn't pleasant (anything below 72Hz has visible flicker on a phosphor monitor).



My question is: how long until "gamers" start preaching how they can SEE a HUGE difference between 60 fps and 120 fps and a game simply can't be played at JUST 60 fps? Then we'll know we've reached a whole new generation.