
When Will the Frame Rate Standard Go Beyond 60fps?

aLkaLiNE said:

Not entirely sure what you guys are talking about; 120Hz smart TVs have been a thing for years, and we even have a select few that run at 240Hz. That doesn't even take into consideration plasma TVs, which operate at an insane 500Hz.

Personally, I would not buy a modern-day TV if it only had a 60Hz rating. That's something I consider a big deal, and it directly impacts the lag between what the console outputs and what you're actually seeing.


Screen refresh rate and game frame rates are not the same thing.

You're not going to notice much difference in gaming frame rates above 60fps. Past that, it's really about frame times instead: an average frame rate of 60fps doesn't necessarily mean frames are presented on the screen in even time increments. And when the frames your game outputs aren't synchronized with your TV's refresh cycle, you get screen tearing. That's why technologies like G-Sync and FreeSync exist on the PC side.
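To put a number on that, here's a minimal sketch in Python (the frame times are made up purely for illustration) of how a run of frames can average out to roughly 60fps while individual frames still arrive late enough to be felt as stutter.

# Minimal sketch with hypothetical frame times: an "average 60fps" run can still stutter.
# Fifty even frames plus a few hitch/recovery pairs that still average out to 16.7 ms.
frame_times_ms = [16.7] * 50 + [28.4, 5.0] * 5

avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / avg_frame_time
worst_frame_ms = max(frame_times_ms)

print(f"Average FPS:      {avg_fps:.1f}")        # ~59.9 - looks perfectly smooth on paper
print(f"Worst frame time: {worst_frame_ms} ms")  # but some frames take nearly twice as long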

 



niallyb

JRPGfan said:
Waste of resources to go beyond 60fps, imo.

All you need is a FreeSync/G-Sync-like feature, and 60fps becomes so damn enjoyable you really wouldn't complain at all about it not being higher.


This! You are exactly correct, sir. On the console side, though, I would expect the manufacturers to just lock the frame rate. There's no real chance of variable/adaptive refresh rate TVs becoming the standard. It's a niche feature that only benefits gamers (beautifully) but has no real application when it comes to watching TV or movies.



niallyb

SpokenTruth said:
Lawlight said:
SpokenTruth said:
Lawlight said:
Probably yes, since graphics themselves have plateaued while machines are getting more powerful.

No, they haven't. Besides that, devs will use the greater power for more geometry, A.I., physics, A.A., etc...


Yes, they have. Otherwise, your 3-year-old video card wouldn't have been able to keep up.

So your 3-year-old card plays the most demanding games today on ultra settings at the same frame rate it was getting in the most demanding games 3 years ago?

I doubt that. A Radeon HD 7970 or GeForce GTX 680 from 2012 is not going to run Battlefield Hardline as well as it ran Battlefield 3. Just because those cards can still run newer games doesn't mean graphics have stagnated. Try either of them on The Witcher 3 and see how well they "keep up" as you claim.

I'm not a PC gamer, but they claim that a 3-year-old card would work just as well.



SpokenTruth said:
Lawlight said:

I'm not a PC gamer, but they claim that a 3-year-old card would work just as well.

Oh sure, that. Yes, you can still use a top-of-the-line 3-year-old card, but you're going to have to reduce settings or resolution to play the most recent games at 60fps. Mid-range cards are going to struggle pretty hard without major settings and resolution changes.

So yes, you can still use them; just don't expect the same experience as a high-end modern card.

Graphics haven't stagnated on PC; you've always been able to play new games with older cards, just with reduced settings.

 

But to stay on topic: frame rate is a balance against the hardware. It doesn't matter how powerful the hardware is; a developer can bring it to a crawling slideshow by trying to force more than it can handle (or by coding it poorly).

That said, I maintain that graphics have stagnated. It's apparent just by looking at the games themselves.



spemanig said:
Tachikoma said:

So long as 4K is being dangled before developers like the proverbial carrot, and is the next big thing to slap on a game box before your competitors do, you can say goodbye to 60fps, let alone 120.

Personal note: those TVs they have in electronics stores with a sports match playing, showing off the TV's 200+Hz "fluid motion" crap - I always find those jarringly weird. Even though it isn't really possible, they feel like they're running smoother than real life; it's hard to explain.


I think it has something to do with the difference between the way cameras capture footage and the way our brains affect how we perceive the world. High-frame-rate cameras, as far as I can tell, don't capture or add motion blur. As far as I'm aware, that's something our brains naturally add to fast-moving objects in the world to help us comprehend them better, but since that's not how high-frame-rate video is displayed - it's just flashing lights - our brains can't do that motion-blur fill-in, making everything look weird. Basically, if we saw the world the way these cameras capture it, then when you waved your hand in front of your face, instead of seeing a big blurry mess of skin you'd see the very detailed and accurate movement of your hand rapidly moving back and forth, with substantially less to completely nonexistent blur.

In a way, it sort of is smoother than in real life, because our brains can't cheat for us.

I think you've got that backwards. Any motion blur our eyes add, they can add just as well to moving objects on a screen. It's how far objects move through your visual field while the light is being collected on your retina that causes blur. You don't get motion blur on non-rotating objects that you follow with your eyes, which is probably why 120Hz/240Hz smooth motion looks weird. The motion blur is in the original footage and gets smeared out over multiple frames. Now it runs at 120 or even 240fps, yet when you follow the ball with your eyes, it is still blurred as captured by the original 60fps camera. Maybe that even interferes with your motion perception, which relies on clear lines and shapes to detect movement. Instead of your brain seeing a blurry object "step" through your visual field and inferring motion from that, now it can't see the steps any more, and the object becomes a blur smoothly moving along.

I don't think there's a difference between waving your hand in front of you and watching a high-fps recording of waving your hand in front of you. Depending on the light source, of course - waving your hand in front of a CRT monitor was an easy test for flicker :)

It probably mostly has to do with adjusting to 120fps recordings. From birth your brain has learned how to deal with 24, 30 and 60fps. 120fps is new and apparently still not high enough to remove the distinction between looking out a window and looking at a 120fps screen. Back before LCD, I used to game at up to 100Hz (a CRT doesn't really become flicker-free below 100Hz), and I didn't notice anything weird about it. That was before motion blur was added to games.
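To make the smearing point concrete, here's a minimal sketch in Python (a naive linear blend between two hypothetical frames; real TVs use motion-compensated interpolation, so treat this purely as an illustration): whatever blur is already baked into the 60fps source frames is carried straight into the inserted in-between frames.

import numpy as np

def interpolate(frame_a, frame_b, t=0.5):
    # Weighted blend of two source frames - a crude stand-in for a TV's inserted frame.
    return (1.0 - t) * frame_a + t * frame_b

# Two hypothetical 60fps frames in which a fast-moving ball is already motion-blurred.
frame_a = np.zeros((4, 8))
frame_a[2, 1:4] = 0.5   # blur streak captured by the camera
frame_b = np.zeros((4, 8))
frame_b[2, 4:7] = 0.5   # the same streak one frame later

middle = interpolate(frame_a, frame_b)  # the extra "120Hz" frame
print(middle[2])  # still a smear - interpolation can't remove blur that was captured at 60fps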



SpokenTruth said:
Lawlight said:
SpokenTruth said:
Lawlight said:

I'm not a PC gamer

That said, I maintain that graphics have stagnated. It's apparent just by looking at the games themselves.

Seriously?   You're going to claim they look the same without actually playing any of them?

I suppose the physics look the same too? 

I don't need to play a game to see how it looks. Yeah, what are the games that have changed physics so much these past few years?



Finally! I have been asking people this question on YouTube all the time. I am just interested in the progress of the industry.

For me, faster games like Destiny should run at a higher frame rate because their animations are complicated and characters travel distances quickly. It being only 30fps was a major flaw to me. I read some YouTuber joke that it should be 120fps with all the crap on the screen and the jump mechanics. That said, the old Halo games never bothered me at 30fps, and I don't even feel much of a difference with them being 60fps now.



I just want a FIXED framerate that's good.



 


SpokenTruth said:
Lawlight said:
SpokenTruth said:
Lawlight said:

That said, I maintain that graphics have stagnated. It's apparent just by looking at the games themselves.

Seriously?   You're going to claim they look the same without actually playing any of them?

I suppose the physics look the same too? 

I don't need to play a game to see how it looks. Yeah, what are the games that have changed physics so much these past few years?

Yeah, you kind of need to actually play them to understand the difference between a game today and a game several years ago.

Better stated: have the minimum and recommended requirements for PC games stayed the same over the past 3 years? No, they haven't. Given that, please enlighten us as to how, with just a glance at a game (and never playing any), you account for the increase in requirements.


Minimum requirements don't mean anything, as they don't take optimization into consideration, which would explain why some games that came out years ago are more demanding than games released now.

And if you can't show me the difference in screenshots, then graphics haven't gotten noticeably better.



Cheeky answer:
When you buy a gaming PC.