When Will the Frame Rate Standard Go Beyond 60fps?


spemanig said:
Tachikoma said:

So long as 4K is being dangled before developers like the proverbial carrot, and is the next big thing to slap on a game box before your competitors do, you can say goodbye to 60fps, let alone 120.

Personal note: those TVs in electronics stores with a sports match playing, showing off the set's 200+Hz "fluid motion" crap, always look jarringly weird to me. Even though it isn't really possible, they feel like they're running smoother than real life; it's hard to explain.


I think it has something to do with the difference between the way cameras capture footage and the way our brains affect how we perceive the world. High framerate cameras, as far as I can tell, don't capture or add motion blur. As far as I'm aware, that's something our brains naturally add to fast-moving objects to help us comprehend them better, but since high framerate video isn't displayed that way at all (it's just flashing lights), our brains can't do that motion blur fill-in, making everything look weird. Basically, if we saw the world the way these cameras capture it, then when you waved your hand in front of your face, instead of seeing a big blurry mess of skin, you'd see the detailed, accurate movement of your hand rapidly moving back and forth with substantially less blur, or none at all.

In a way, it sort of is smoother than in real life, because our brains can't cheat for us.

I think you've got that backwards. Any motion blur our eyes add, they can add just as well to moving objects on a screen. It's how far objects move through your visual field while light is being collected on your retina that causes blur. You don't get motion blur on non-rotating objects that you follow with your eyes, which is probably why 120Hz/240Hz smooth motion looks weird. The motion blur is in the original footage and gets smeared out over multiple frames. Now it runs at 120 or even 240 fps, yet when you follow the ball with your eyes, it is still blurred as captured by the original 60 fps camera. Maybe that even interferes with your motion perception, which relies on clear lines and shapes to detect movement. Instead of your brain seeing a blurry object 'step' through your visual field and inferring motion from that, now it can't see the steps any more and it becomes a blur smoothly moving along.

I don't think there's a difference between waving your hand in front of you and watching a high fps recording of waving your hand in front of you. Depending on the light source of course, waving your hand in front of a CRT monitor was an easy test for flicker :)

It probably mostly has to do with adjusting to 120 fps recordings. From birth your brain has learned how to deal with 24, 30 and 60fps; 120fps is new, and apparently still not high enough to remove the distinction between looking out a window and looking at a 120fps screen. Back before LCD, I used to game at up to 100Hz (CRTs don't really become flicker-free below 100) and I didn't notice anything weird about it. That was before motion blur was added to games.
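The point above, that interpolation can't remove blur already baked into the footage, can be sketched with some back-of-the-envelope arithmetic. This is just a toy model with made-up numbers (the speeds and shutter times are illustrative, not from the thread):

```python
# Toy model: motion blur in a captured frame depends on the camera's shutter
# (exposure) time, not on the playback rate. Interpolating 60 fps footage to
# 240 fps inserts extra frames but cannot remove the smear already recorded
# in each original frame.

def blur_extent_px(speed_px_per_s: float, shutter_s: float) -> float:
    """Distance an object smears across the sensor during one exposure."""
    return speed_px_per_s * shutter_s

# A ball crossing the screen at 2000 px/s, shot at 60 fps with a 180-degree
# shutter (1/120 s exposure):
captured_blur = blur_extent_px(2000, 1 / 120)    # ~16.7 px of smear per frame

# Interpolating to 240 fps changes frame timing, not exposure: every
# interpolated frame inherits the same smear from its source frames.
interpolated_blur = captured_blur

# A native 240 fps capture with a matching 180-degree shutter (1/480 s)
# would smear only a quarter as much:
native_240_blur = blur_extent_px(2000, 1 / 480)  # ~4.2 px
```

So when your eyes track the ball on a 240Hz "smooth motion" set, they still see the 60 fps camera's blur, which may be part of why it looks off.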



SpokenTruth said:
Lawlight said:
SpokenTruth said:
Lawlight said:

I'm not a PC gamer

That said, I insist on the fact that graphics have stagnated. It's apparent just by looking at the games themselves.

Seriously?   You're going to claim they look the same without actually playing any of them?

I suppose the physics look the same too? 

I don't need to play a game to see how it looks. Yeah, what are the games that have changed physics so much these past few years?



Lawlight said:
SpokenTruth said:
Lawlight said:

That said, I insist on the fact that graphics have stagnated. It's apparent just by looking at the games themselves.

Seriously?   You're going to claim they look the same without actually playing any of them?

I suppose the physics look the same too? 

I don't need to play a game to see how it looks. Yeah, what are the games that have changed physics so much these past few years?

Yeah, you kind of need to actually play them to understand the difference between a game today and a game several years ago.

Better stated: have the minimum and suggested requirements for PC games stayed the same over the past 3 years? No, they haven't. Given that fact, please enlighten us how, with just a glance at a game (and never playing any), you can account for the increase in requirements?



Massimus - "Trump already has democrat support."

Finally! I've been asking people this question on YouTube all the time. I'm just interested in the progress of the industry.

I think faster games like Destiny should run at a higher framerate, because their animations are complicated and characters cover distances quickly. It being only 30 fps was a major flaw to me. I read some YouTuber joke that it should be 120 fps with all the crap on the screen and the jump mechanics. The old Halo games never bothered me at 30 fps, though, and I don't even feel much of a difference now that they're 60 fps.



I just want a FIXED framerate that's good.



 

-----------------------------------------------------------------------------------

12/22/2016- Made a bet with Ganoncrotch that the first 6 months of 2017 will be worse than 2016. A poll will be made to determine the winner. Loser has to take a picture of them imitating their profile picture.

SpokenTruth said:
Lawlight said:
SpokenTruth said:
Lawlight said:

That said, I insist on the fact that graphics have stagnated. It's apparent just by looking at the games themselves.

Seriously?   You're going to claim they look the same without actually playing any of them?

I suppose the physics look the same too? 

I don't need to play a game to see how it looks. Yeah, what are the games that have changed physics so much these past few years?

Yeah, you kind of need to actually play them to understand the difference between a game today and a game several years ago.

Better stated: have the minimum and suggested requirements for PC games stayed the same over the past 3 years? No, they haven't. Given that fact, please enlighten us how, with just a glance at a game (and never playing any), you can account for the increase in requirements?


Minimum requirements don't mean anything, as they don't take optimization into consideration; that would explain why some games that came out years before are more demanding than games released now.

And if you can't show me the difference in screenshots, then graphics haven't gotten noticeably better.



Cheeky answer:
When you buy a gaming pc.



spemanig said:
Tachikoma said:

So long as 4K is being dangled before developers like the proverbial carrot, and is the next big thing to slap on a game box before your competitors do, you can say goodbye to 60fps, let alone 120.

Personal note: those TVs in electronics stores with a sports match playing, showing off the set's 200+Hz "fluid motion" crap, always look jarringly weird to me. Even though it isn't really possible, they feel like they're running smoother than real life; it's hard to explain.


I think it has something to do with the difference between the way cameras capture footage and the way our brains affect how we perceive the world. High framerate cameras, as far as I can tell, don't capture or add motion blur. As far as I'm aware, that's something our brains naturally add to fast-moving objects to help us comprehend them better, but since high framerate video isn't displayed that way at all (it's just flashing lights), our brains can't do that motion blur fill-in, making everything look weird. Basically, if we saw the world the way these cameras capture it, then when you waved your hand in front of your face, instead of seeing a big blurry mess of skin, you'd see the detailed, accurate movement of your hand rapidly moving back and forth with substantially less blur, or none at all.

In a way, it sort of is smoother than in real life, because our brains can't cheat for us.

Oh! That explains why I like 60 fps in video games (where they can add motion blur) but it makes me motion sick with real-life footage. I always get a really weird uncanny-valley feeling when I watch streamers who stream in 60fps, and I didn't like the 48fps of The Hobbit much either. Although I think I'm slowly getting used to it.



I'm sorry, but I have to say that it's useless… I really don't understand your excitement for this.



I don't know, maybe 2020 ?